
You know how geothermal "volcanism" can create bizarre formations? There are liquid hydrocarbons under ice planet surfaces. Tidal forces can create cracks and force these out, ejecting them into the sky; then, when the low temperatures set in, the clouds and steam condense or go straight to the solid state, and that must look very strange. The plain ice world with a snowstorm - that's nothing like what ice worlds really look like; they're a lot more bizarre.
I don't think we can apply too much realism to this game, it's more about the experience, and as long as it's not No Man's Sky territory I'm good.
 
First mods I'll look for will probably be bicycles, destructible children, and killable terrain. Should be a pretty wide selection by the time I find an acceptable means to play the game.

I don't even know what monitor resolution is deemed '4k'. Pretty sure I don't want to know either.

3840×2160!

Who's pairing an RX 6800 XT with an i5-10600 or Ryzen 5 3600X???

Anyone who already has those CPUs and wants the most economical upper-midrange gaming upgrade. Those are still fairly capable CPUs, and the 6800 XT probably has the best GPU price/performance ratio out there at the moment.

My 6800 XT will eventually make it into my i7-5820K or even Xeon X5670 box, before I completely retire those systems, because I usually upgrade GPUs faster than I do motherboards.

Trying to preach to an Xbox Series X owner and player about what an Xbox does, is, or isn't...have you ever seen an Xbox, never mind actually used one? The comparison of hardware specs is totally irrelevant...no matter what you read on some PC master race fanboy site...they're both engineered and designed to use the hardware completely differently.

I don't own any modern consoles, but I have played some cross-platform games on them.

I'd wager that if I built a PC with hardware specs as similar to a given modern console's as practical, it would run most cross-platform games similarly well. There would be some differences, but most of the overhead problems show up in OS memory footprint, not in the same kind of code running inherently slower on Windows or the D3D12/Vulkan APIs than on console OSes and APIs. Larger gaps would have been expected in the past, with less efficient graphics APIs on PC and comparatively lower-level tuning on consoles, but that's less the case today.

There are some astonishingly bad ports, but I'd argue that if one is going to look at the relative gaming capabilities of the platforms, and at what the fundamental performance costs of the OS, drivers, and APIs actually are, one should use the best ports, not the sloppiest, for such a comparison.
 
I don't think we can apply too much realism to this game, it's more about the experience, and as long as it's not No Man's Sky territory I'm good.
I agree about too much realism - I rather meant taking artistic inspiration from nature, or literature. Mass Effect Andromeda featured a couple of peculiar landmarks on ice planets that aren't so commonly envisioned in video game art. Or the SWTOR MMO, which had a "forest" of ice crystals, for example.
 
The Steam survey for May shows the first AMD GPU hasn't even reached the Top 20. It's all Nvidia.

I don't think AMD has a future without FSR 3.0. Nvidia is just so far ahead, and this is just the start. Can't wait to see what the 50 series Nvidia cards will bring next year.
Also, AMD blocking DLSS is just sad, but it's a good reason to switch to Nvidia.

I have never been more impressed with a video card than I was with the 40 series. It consumes sooo much less power, it's so quiet, and I swear it runs 10 degrees cooler than my 30 series card, while having a massive performance boost with DLSS 3. A real step into the future :cool:
 
The Steam survey for May shows the first AMD GPU hasn't even reached the Top 20. It's all Nvidia.

I don't think AMD has a future without FSR 3.0. Nvidia is just so far ahead, and this is just the start. Can't wait to see what the 50 series Nvidia cards will bring next year.
Also, AMD blocking DLSS is just sad, but it's a good reason to switch to Nvidia.

I have never been more impressed with a video card than I was with the 40 series. It consumes sooo much less power, it's so quiet, and I swear it runs 10 degrees cooler than my 30 series card, while having a massive performance boost with DLSS 3. A real step into the future :cool:
You keep saying that, basically because, being unnaturally fixated on some software upscaler, you were daft enough to pay Nvidia's extortionate prices to keep riding the big old 40 series hype train that some of us jumped off at the last stop...looking to Steam for a realistic hardware review isn't really a good sign...

Maybe all this Nvidia hype will come true if you say it often enough...try saying it three times, spinning around, then touching your toes...might work 😗
 
The Steam survey for May shows the first AMD GPU hasn't even reached the Top 20. It's all Nvidia.

I don't think AMD has a future without FSR 3.0. Nvidia is just so far ahead, and this is just the start. Can't wait to see what the 50 series Nvidia cards will bring next year.
Also, AMD blocking DLSS is just sad, but it's a good reason to switch to Nvidia.

I have never been more impressed with a video card than I was with the 40 series. It consumes sooo much less power, it's so quiet, and I swear it runs 10 degrees cooler than my 30 series card, while having a massive performance boost with DLSS 3. A real step into the future :cool:
I've always been for the GeForce cards, however after a lot of research my next mid-range GPU will be AMD, because it's cheaper and, in many cases without RT, performs better in raw FPS. I really don't care about RT, as in many of the games I play it doesn't matter; CPF (cost per frame) is what I'm looking at. FSR and DLSS are nice features, but not what I'm looking for when I'm building a gaming PC.
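Since CPF comes up several times in this thread, here's a minimal sketch of the metric as described above: purchase price divided by average FPS in the games you care about. The card names, prices, and FPS figures are made-up placeholders, not benchmark results.

```python
# Cost per frame (CPF): purchase price divided by average FPS.
# Prices and FPS below are hypothetical placeholders, not benchmark data.
cards = {
    "Card A": {"price_usd": 500, "avg_fps": 100},
    "Card B": {"price_usd": 800, "avg_fps": 140},
    "Card C": {"price_usd": 1600, "avg_fps": 220},
}

def cost_per_frame(card: dict) -> float:
    return card["price_usd"] / card["avg_fps"]

# Rank cards from best (cheapest per frame) to worst.
for name, card in sorted(cards.items(), key=lambda kv: cost_per_frame(kv[1])):
    print(f"{name}: ${cost_per_frame(card):.2f} per average frame")
```

With these placeholder numbers the cheapest card wins on CPF even though the most expensive one has the highest raw FPS, which is exactly the trade-off the post is describing.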
 
You keep saying that, basically because, being unnaturally fixated on some software upscaler, you were daft enough to pay Nvidia's extortionate prices to keep riding the big old 40 series hype train that some of us jumped off at the last stop...looking to Steam for a realistic hardware review isn't really a good sign...

Maybe all this Nvidia hype will come true if you say it often enough...try saying it three times, spinning around, then touching your toes...might work 😗
Was playing a game just now. Went into a city, GPU load went to 90%. Turned on Frame Gen, back to 50%. A miracle software and hardware combination came together to give the best gaming experience to date.

I've always been for the GeForce cards, however after a lot of research my next mid-range GPU will be AMD, because it's cheaper and, in many cases without RT, performs better in raw FPS. I really don't care about RT, as in many of the games I play it doesn't matter; CPF (cost per frame) is what I'm looking at. FSR and DLSS are nice features, but not what I'm looking for when I'm building a gaming PC.

The graphics card market right now is very unstable. GPU prices, especially AMD's, drop from month to month. If you wait a year, you can buy the same card for 40% off, if not more, especially if you're aiming for the older-generation 6000 series. But if you're set on AMD, at least wait for the FSR 3.0 release and testing. If they can't make it work and, like Nvidia's, it requires next-gen cards, you'll be switching that card soon after, especially if you're after CPF.
Just be patient, save some extra money, and you'll be much happier later on.
 
Was playing a game just now. Went into a city, GPU load went to 90%. Turned on Frame Gen, back to 50%. A miracle software and hardware combination came together to give the best gaming experience to date.



The graphics card market right now is very unstable. GPU prices, especially AMD's, drop from month to month. If you wait a year, you can buy the same card for 40% off, if not more, especially if you're aiming for the older-generation 6000 series. But if you're set on AMD, at least wait for the FSR 3.0 release and testing. If they can't make it work and, like Nvidia's, it requires next-gen cards, you'll be switching that card soon after, especially if you're after CPF.
Just be patient, save some extra money, and you'll be much happier later on.
So you keep saying...I'm not that fussed about FSR personally...or any software upscaling...since I rarely use it. I already have one of the 'older' 6000 series cards, a Red Devil RX 6950 XT.

The easiest way to save power and GPU usage is to limit the framerate in the game settings (or the AMD Adrenalin software in my case) instead of getting fixated on little numbers in the corner of your monitor...something like limiting to 100fps works fine for me, dropping my GPU usage from 90%+ to around 75-80% in the majority of GPU-intensive games...FSR or no FSR 🤷‍♂️
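As a rough illustration of why capping works: a GPU-bound card sits idle for part of each frame interval once you cap below its uncapped rate, so utilization scales roughly with the ratio of the two. A minimal sketch; the ~125fps uncapped figure is an assumption chosen to line up with the 90%-to-roughly-80% drop described above.

```python
# First-order estimate for a GPU-bound game: utilization scales with
# fps_cap / uncapped_fps. The uncapped figure is an assumed illustration,
# not a measurement.
def estimated_gpu_utilization(fps_cap: float, uncapped_fps: float) -> float:
    """Fraction of each frame interval spent rendering, capped at 100%."""
    return min(1.0, fps_cap / uncapped_fps)

# A game the card could run at ~125 fps flat out, capped to 100 fps:
print(f"{estimated_gpu_utilization(100, 125):.0%}")  # 80%, near the 75-80% reported
```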
 
So you keep saying...I'm not that fussed about FSR personally...or any software upscaling...since I rarely use it. I already have one of the older 6000 series cards, an RX 6950 XT.

The easiest way to save power and GPU usage is to limit the framerate in the game settings (or the AMD Adrenalin software in my case) instead of getting fixated on little numbers in the corner of your monitor...something like limiting to 100fps works fine for me, dropping my GPU usage from 90%+ to around 75-80% in the majority of GPU-intensive games...FSR or no FSR 🤷‍♂️
I always cap FPS; I never like running either the GPU or the CPU at high loads. That's why I was so surprised to see the 90% load, but thankfully Nvidia's Frame Generation was there to rescue the situation.
This is what's so great about it: the AI is simply there to assist in enhancing your gaming experience, it's not trying to take over the world :sneaky:.
 
I always cap FPS; I never like running either the GPU or the CPU at high loads. That's why I was so surprised to see the 90% load, but thankfully Nvidia's Frame Generation was there to rescue the situation.
This is what's so great about it: the AI is simply there to assist in enhancing your gaming experience, it's not trying to take over the world :sneaky:.
It is nice, no denying it...but I'm sure AMD will come up with something similar with FSR when they get around to it. You seem to have a thing about AMD GPUs for some reason...did they come around and kick your dog or something? I went over to AMD from Nvidia strictly for the bang-for-buck reason...if I wanted the equivalent performance to my $600 RX 6950 XT from an Nvidia card, I'd be 2 grand poorer and saddled with a 420W power-sucking house brick of a 3090 Ti that has a heat signature visible from space...

Like most of us, as gamers with a finite budget but still looking for hardware with decent performance, I don't really have the luxury of having to step over my disposable income on the way out of the Mole-Cave. I can't think of a better reason to go AMD, personally 🤷‍♂️
 
It is nice, no denying it...but I'm sure AMD will come up with something similar with FSR when they get around to it. 🤷‍♂️

They already have: FSR 3 comes with frame interpolation, and it can in theory render multiple intervening frames in some situations, so it's just a matter of waiting until it's ready for release. It will, however, be AMD-card-only tech, like DLSS is for Nvidia.
 
And look at this - a train station with a train. So how does that work, and will it be in the game??

I remember reading in the SC thread a while back that Starfield will use trains to get you around the larger cities... looks like a mag-rail setup to me.
A guy from IGN was able to play Starfield for an hour; he reported that New Atlantis, the largest city, has a train system that connects the various districts.
Bethesda already said that New Atlantis is the biggest city they've ever built, which sounds pretty cool to me.
 
The Steam survey for May shows the first AMD GPU hasn't even reached the Top 20. It's all Nvidia.
I don't think AMD has a future without FSR 3.0. Nvidia is just so far ahead, and this is just the start.

NVIDIA is hugely more popular, but that has much more to do with marketing and mindshare than feature sets. Most consumers don't really know anything and will buy on popularity alone.

Can't wait to see what the 50 series Nvidia cards will bring next year.

Not next year. 2025.

NVIDIA is back on a 30-month release cadence and their gaming division will be an afterthought as long as they can make an order of magnitude more money per wafer ordered with HPC/AI products. It costs maybe three times as much to make an H100 as it does an RTX 4090 (their highest margin consumer part), but the H100 (36-40k USD) sells for twenty times the price of an RTX 4090 (1600-2000 USD).
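To put rough numbers on that margin gap, here's a back-of-the-envelope sketch using the figures in the paragraph above. The absolute build cost of the RTX 4090 is an arbitrary assumption, since only the ~3x cost ratio and the sale prices are given.

```python
# Back-of-the-envelope: gross margin per H100 vs per RTX 4090, using
# the low end of the quoted price ranges. The 4090 build cost is an
# assumed placeholder; only the 3x cost ratio comes from the post.
rtx4090_build_cost = 300.0                 # USD, assumption
rtx4090_price = 1_600.0                    # low end of 1600-2000 USD
h100_build_cost = 3 * rtx4090_build_cost   # "maybe three times as much to make"
h100_price = 36_000.0                      # low end of 36-40k USD

margin_ratio = (h100_price - h100_build_cost) / (rtx4090_price - rtx4090_build_cost)
print(f"gross margin per H100 is ~{margin_ratio:.0f}x that of an RTX 4090")
# ~27x per part sold; even after die size and yield differences, the
# per-wafer profit gap plausibly lands around an order of magnitude.
```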

AMD is in a similar situation. They're likely going to ride RDNA3, without any major refreshes, for another year while they focus on the CPU and HPC side of business.

Also, AMD blocking DLSS is just sad, but it's a good reason to switch to Nvidia.

Not sure how DLSS not being in a particular title would incentivize this.

It consumes sooo much less power, it's so quiet, and I swear it runs 10 degrees cooler than my 30 series card, while having a massive performance boost with DLSS 3.

Lovelace is the most efficient GPU architecture currently, but it's essentially a given that a newer architecture is more efficient than any past ones.

The performance boost, die flavor to die flavor (e.g. GA102 to AD102) is also enormous, but that applies even without frame generation and I wouldn't base a purchase around frame generation as it's still quite niche.

I've always been for the GeForce cards, however after a lot of research my next mid-range GPU will be AMD, because it's cheaper and, in many cases without RT, performs better in raw FPS. I really don't care about RT, as in many of the games I play it doesn't matter; CPF (cost per frame) is what I'm looking at. FSR and DLSS are nice features, but not what I'm looking for when I'm building a gaming PC.

I went over to AMD from Nvidia strictly for the bang-for-buck reason...if I wanted the equivalent performance to my $600 RX 6950 XT from an Nvidia card, I'd be 2 grand poorer and saddled with a 420W power-sucking house brick of a 3090 Ti that has a heat signature visible from space...

A few years ago, due to supply issues, one was best served by getting whatever they could get their hands on, if they needed something at that moment. However, today, RDNA2 parts are the clear winners in value from about 250-600 USD. Then, up to about 900 USD, RDNA3 is frequently more compelling than Ampere or Lovelace. There is some argument for the 4070, but below that, NVIDIA's biggest advantages matter less and less, because even NVIDIA parts in those segments are too slow to handle heavy ray tracing. Once you turn ray tracing off, you no longer need frame generation, and at that point all the AMD cards are better values.

Oddly enough, only the RTX 4070 Ti, 4080, and especially the RTX 4090 reach the point where I might consider them 'good value', despite their high costs, because only they can do meaningful gaming things that categorically cannot be done on cheaper parts from AMD (or Intel). Below the 200-250 dollar range, no new parts look particularly compelling, and I recommend second hand stuff. If someone is about to spend 250-300 USD, the lower-end RDNA2 parts are again where I'd point them.

As for power, RDNA2 is more power efficient than Ampere, but the difference is not large, and it's not hard to cap a GA102 part to a reasonable power level without losing much, if any, performance. Both are significantly less power efficient than Lovelace or RDNA3, and RDNA3 falls notably short of Lovelace on performance per watt.

Personally, my power budgets are generally limited only by my cooling. When they were my flagship parts, I ran both my 6800 XT and 6900 XT at 400-450W power limits, because that's the amount of heat I could comfortably remove from them without having the system sound like a tornado. I used 600W on my RTX 4090 (because the cooler was enormous) when it was on air, and I use 666W (the maximum the firmware I like allows), with considerably lower noise levels, now that I'm water cooling it with three radiators. However, I could easily run it at 300W or even a bit less and lose less than 20% of its peak performance. No reason for me to, though; my new heat pump is pretty good at keeping my house cool, and my cooling and PSU handle the 666W limit. I probably wouldn't go higher, as I'm finally starting to brush up against actual hardware limits.
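For a sense of what that trade looks like in perf-per-watt terms, a quick sketch using the numbers above (the 666W limit, and the estimate of losing under 20% performance at ~300W). Performance is normalized, not measured.

```python
# Perf-per-watt arithmetic for the power-capping point above.
# Performance is normalized to 1.0 at the full 666 W limit; the 80%
# retained at ~300 W is the post's own "lose less than 20%" estimate.
full_power_w, full_perf = 666.0, 1.00
capped_power_w, capped_perf = 300.0, 0.80

full_eff = full_perf / full_power_w
capped_eff = capped_perf / capped_power_w
print(f"efficiency gain from capping: {capped_eff / full_eff:.2f}x")
# ~1.78x: keeping ~80% of the performance for ~45% of the power, which is
# typical of how GPU voltage/frequency curves flatten near the top.
```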

Was playing a game just now. Went into a city, GPU load went to 90%. Turned on Frame Gen, back to 50%. A miracle software and hardware combination came together to give the best gaming experience to date.

Real-time frame generation in gaming is progress, but having GPU utilization fall by half when rendering half the frames is hardly miraculous.
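A minimal sketch of the arithmetic behind that point: with the framerate capped, interpolating every other displayed frame means the GPU renders only half of them, so load drops roughly in proportion. The 100fps cap and 90% baseline load are taken from the posts above; the one-in-two interpolation ratio is the usual frame generation setup.

```python
# Why halved GPU load with frame generation is expected, not magic:
# under an fps cap, every other displayed frame is interpolated, so
# only half the frames are actually rendered.
fps_cap = 100          # displayed fps (capped), per the posts above
rendered_share = 0.5   # one interpolated frame per rendered frame
baseline_load = 0.90   # GPU load when rendering all frames, per the post

rendered_fps = fps_cap * rendered_share
estimated_load = baseline_load * rendered_share
print(f"rendered {rendered_fps:.0f} fps, estimated load {estimated_load:.0%}")
# -> rendered 50 fps, estimated load 45%: close to the 50% reported.
```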

The graphics card market right now is very unstable. GPU prices, especially AMD's, drop from month to month. If you wait a year, you can buy the same card for 40% off, if not more.

I think prices still have some room to move down, but not much. Prices will level off soon. It's getting to the point where old 3070/80/90 series stock is starting to dry up, where the mainstream cards can't really get any cheaper due to aggressively priced second hand parts, and there is little incentive to reduce costs further at the high-end. There is also no new generation of parts from AMD or NVIDIA for at least a year in all likelihood, and the NVIDIA refresh is probably just going to push the top-end even higher.

If one could use a card now, waiting a year probably isn't going to save them that much.

Bethesda already said that New Atlantis is the biggest city they've ever built

I suspect New Atlantis will be impressive, but this statement doesn't say much. None of the games they've released in the last 25 years have any cities in them that are larger than small towns, unless they were largely empty ruins. Even if we go all the way back to Daggerfall or Arena, we have moderately sized walled towns standing in for national capitals. New Atlantis being the biggest city they've ever built is pretty much mandatory for it to even seem like a city.
 
I haven't dabbled in computer parts in a decade and have no idea what you guys are talking about.
Let's say I'm about to build a computer for Starfield and went with an i5-12400F - what's a good GPU to match it to?
 
Let's say I'm about to build a computer for Starfield and went with an i5-12400F - what's a good GPU to match it to?

Practically speaking, an i5-12400F is still fast enough to credibly drive almost anything, so GPU comes entirely down to your budgets (price, power, and physical space).

I've already mentioned the RX 6800 XT, but I'll mention it again because there is nothing that can beat it in its price segment and it's the faster of the GPUs on the 'recommended' system specification for the game.

At the next step up in price, it's pretty much a toss-up between the RX 6950 XT and the RTX 4070. The former has more VRAM, but the latter is a fair bit more efficient, and 12GiB is likely still enough.

Spend a bit more than that and you're looking at the RX 7900 XT vs the RTX 4070 Ti, which are also reasonably well matched, but at this level the VRAM differential is a bit more meaningful and the efficiency gap not quite as wide.

After this point, there are three GPUs that stand alone in three separate price segments. Nothing beats the 7900 XTX, the RTX 4080, or RTX 4090 where they are priced. For Starfield specifically, the 7900 XTX is almost certainly a better deal than an RTX 4080 because they are quite close in performance where heavy ray tracing or frame generation aren't factors. At the top of the pile, the RTX 4090 will beat anything else, probably by a wide margin at higher settings, but we're now more than three times the price of an RX 6800 XT.

Edit: There will also be an RX 7800 XT releasing some time in the not so distant future, quite probably before Starfield. Unless it's priced stupidly, it will probably knock the 6950 XT out of the running, or force another modest price cut to the remaining 6700/6800/6900 series parts.
 
Yeah, I am looking to upgrade my old laptop (6+ years old, GTX 1080) for both Starfield and ED.

What I don't understand is that the RTX 4080 on paper looks 'worse' than the 3080 Ti - although it looks like it uses less power.

Some of the choices I have are basically Nvidia 3070, 3080 Ti and 4070...

Any suggestions/advice?

PS No budget - work pays for the laptop...
 
Practically speaking, an i5-12400F is still fast enough to credibly drive almost anything, so GPU comes entirely down to your budgets (price, power, and physical space).

I've already mentioned the RX 6800 XT, but I'll mention it again because there is nothing that can beat it in its price segment and it's the faster of the GPUs on the 'recommended' system specification for the game.

At the next step up in price, it's pretty much a toss-up between the RX 6950 XT and the RTX 4070. The former has more VRAM, but the latter is a fair bit more efficient, and 12GiB is likely still enough.

Spend a bit more than that and you're looking at the RX 7900 XT vs the RTX 4070 Ti, which are also reasonably well matched, but at this level the VRAM differential is a bit more meaningful and the efficiency gap not quite as wide.

After this point, there are three GPUs that stand alone in three separate price segments. Nothing beats the 7900 XTX, the RTX 4080, or RTX 4090 where they are priced. For Starfield specifically, the 7900 XTX is almost certainly a better deal than an RTX 4080 because they are quite close in performance where heavy ray tracing or frame generation aren't factors. At the top of the pile, the RTX 4090 will beat anything else, probably by a wide margin at higher settings, but we're now more than three times the price of an RX 6800 XT.

Edit: There will also be an RX 7800 XT releasing some time in the not so distant future, quite probably before Starfield. Unless it's priced stupidly, it will probably knock the 6950 XT out of the running, or force another modest price cut to the remaining 6700/6800/6900 series parts.
If I went with an RTX 4070, isn't the CPU going to be a bottleneck? I was under the impression the i5-12400F was kind of mid-range, while the RTX 4070 and up are pretty high-end.

I hadn't kept up with AMD numbering. My last computer that used AMD was over ten years ago. I have fond memories of not having to turn the heating on when I played games in the winter and still sweating.
 