The whole idea behind this setup was to achieve acceptable fps, some portability (a past work requirement of mine: I had to be able to pack up my PC and move it to my workplace, and I was using a crude self-made suitcase), and, most importantly, total power consumption around 100W.
Get a mid-range laptop. They are heavily integrated systems where all major components run close to their performance-per-watt sweet spots, out of necessity: cooling, portability, and battery life demand it.
Now, when I read this post and see the configurations proposed, I can't help but think: what's the wattage of such a gaming PC? Half a microwave oven, or more?
An eight-core Zen 4 or Zen 5 plus an upper-mid-range desktop GPU implies a system drawing about 400-450W at the wall under normal gaming loads, without extensive tuning.
If one is willing to buy and tune parts purely around efficiency, ~300W at the wall is doable. Push much past that ballpark and frames-per-watt falls off quickly.
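As a back-of-the-envelope sketch of where those wall figures come from (the component draws and the 90% PSU efficiency below are illustrative assumptions, not measurements):

```python
# Back-of-the-envelope wall-power estimate for a gaming desktop.
# All component figures are illustrative assumptions, not measurements.

def wall_power(component_watts, psu_efficiency=0.90):
    """DC load divided by PSU efficiency gives AC draw at the wall."""
    return sum(component_watts.values()) / psu_efficiency

stock = {
    "cpu": 105,                # eight-core Zen 4/5 in a gaming load (assumed)
    "gpu": 250,                # upper-mid-range desktop GPU (assumed)
    "board_ram_ssd_fans": 40,  # everything else (assumed)
}
tuned = {
    "cpu": 65,                 # power-limited / undervolted (assumed)
    "gpu": 180,                # capped near its efficiency knee (assumed)
    "board_ram_ssd_fans": 35,
}

print(f"stock: ~{wall_power(stock):.0f}W at the wall")   # ~439W
print(f"tuned: ~{wall_power(tuned):.0f}W at the wall")   # ~311W
```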
Personally, my electricity is cheap and I only need to cool my home during the hottest 2-3 months of the year. I can run my primary gaming system at ~400W during the summer, or get ~20% more performance if I let it scale to 700-800W.
Heat output, of course, is almost exactly equal to total power consumption, minus trivial losses such as sound or EMI that escape the immediate vicinity.
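Since essentially every watt ends up as heat in the room, the PC behaves like a space heater for cooling purposes. A quick conversion to BTU/hr (the usual AC sizing unit) makes the summer trade-off above concrete; the wattages here are just the figures discussed in this thread:

```python
# Every watt a PC draws ends up as heat in the room, so for AC sizing
# it behaves like a space heater of the same rating (1 W = 3.412 BTU/hr).
for watts in (100, 300, 450, 800):
    print(f"{watts}W -> ~{watts * 3.412:.0f} BTU/hr of cooling load")
```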
What's the point of playing at 1440p (unless we're talking about really large monitors, or even TV screens)?
I haven't used a display smaller than 32" as my primary in a decade at this point, and I barely look at anything smaller. I can just barely resolve individual pixels on a 32" 4K (3840×2160) display when sitting at about arm's length (36" or so), without straining, and I keep noticing improvements in native resolution until significantly past this point.
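To put a number on "just barely resolve", here's a quick angular-size calculation for that setup (the ~60-arcsecond threshold is the standard 20/20 acuity benchmark, used here as an assumption):

```python
import math

# Angular size of one pixel on a 32" 16:9 4K panel viewed from 36".
diag_in, h_px, viewing_in = 32.0, 3840, 36.0
width_in = diag_in * 16 / math.hypot(16, 9)     # panel width, inches
pixel_in = width_in / h_px                      # pixel pitch, inches

pixel_deg = math.degrees(math.atan2(pixel_in, viewing_in))
print(f"pixel pitch: {pixel_in:.4f} in")                      # ~0.0073
print(f"angular pixel size: {pixel_deg * 3600:.0f} arcsec")   # ~42
print(f"pixels per degree: {1 / pixel_deg:.0f}")              # ~87

# 20/20 vision resolves ~60 arcsec, so ~42 arcsec per pixel sits right
# at the edge of acuity for somewhat-better-than-20/20 eyesight.
```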
Personally, I find 1080p entirely comfortable on most laptop-sized displays, but I can still see sharpness improve at higher resolutions.
Anyway, this stuff has been studied to death and there are broadly accepted SMPTE standards and recommendations derived from them:
Our TV Sizes to Distance Calculator helps you choose the right size TV for your space. The optimal viewing distance is about 1.6 times the diagonal length of the television. For example, for a 55” TV, the best distance is 7 feet.
www.rtings.com
In case someone is wondering what gameplay on my PC looks like, I can say it's sufficient to run Odyssey at a comfortable fps. You can also use it as a reference for the minimum hardware requirements to play Odyssey at a "low resolution" of around 1360×768 (which is really fine on a 15" monitor, imho).
Odyssey allows a whole lot of tuning of GPU-related performance parameters and very little of the CPU-dependent ones. One can get acceptable performance (barely, in certain surface-settlement scenarios) if one's CPU is sufficiently fast, almost irrespective of which GPU is used, within reason. Your i3 is not a bad chip for ED, all things considered, but it would still likely struggle in large settlement surface combat zones.
Of course, if people have image-quality goals, then the sky is the limit on GPU. I'm trying to brute-force my way through the lack of temporal AA, picking up whatever low-hanging graphical fruit (texture and shadow-map resolution, mostly) is available in the process. Unfortunately, even a water-cooled RTX 4090 allowed to pull more than 600W by itself doesn't quite reach the internal resolution I'd need to make those jagged specular highlights unobtrusive. Maybe in another couple of product cycles.
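For a sense of why brute-forcing AA runs out of GPU so quickly: shading work grows with the square of the internal-resolution scale factor. A rough sketch, assuming the load is approximately proportional to pixel count (an approximation, not a benchmark):

```python
# Shading cost of brute-force supersampling at 4K output, assuming the
# GPU load scales roughly with pixel count (an approximation).
base = 3840 * 2160
for factor in (1.0, 1.25, 1.5, 2.0):
    w, h = int(3840 * factor), int(2160 * factor)
    print(f"{factor:.2f}x scale: {w}x{h} = {w * h / 1e6:.1f} MP "
          f"(~{w * h / base:.1f}x the shading work)")
```

At 2.0x the internal resolution, the GPU is shading four times the pixels, which is roughly where a single 600W card stops keeping up.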