It would seem to me that ED should be more CPU than GPU intensive. There's not a lot of stuff, effects processing, going on in Elite Dangerous. Graphically, it's pretty, but it's not cutting-edge stuff. There should be a lot more going on with the BGS and thus the CPU, but maybe that can be off-loaded to the GPU now. I am FAR from a computer scientist, or even a game programmer or any programmer beyond some basic BASIC and PowerShell. Ha!
I'm not a programmer or game developer, but I know a few things about hardware and I can make observations about what applications are doing with it.
Most CPU-intensive games are limited by one or more render threads that prepare batches of draw calls for the graphics driver, by AI workloads, or by physics. ED is evidently modestly well-threaded, but its render threads are not particularly demanding, probably because of its relatively modest graphics design. Likewise, its AI and physics are light compared to many other titles. FDev also seems to have made sure that work is done in compute shaders, on the GPU, where practical.
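To make the render-thread point concrete, here's a toy sketch (my own illustration, not anything from ED's engine) of the pattern described above: the simulation issues draw requests, and a dedicated render thread groups them into batches before handing them off to a (stubbed) driver. The names and batch size are invented for the example.

```python
# Toy producer/consumer render-thread sketch (illustrative only).
import queue
import threading

BATCH_SIZE = 4
draw_queue: "queue.Queue[str | None]" = queue.Queue()
submitted_batches: list[list[str]] = []

def render_thread() -> None:
    batch: list[str] = []
    while True:
        call = draw_queue.get()
        if call is None:                     # sentinel: flush remainder and stop
            if batch:
                submitted_batches.append(batch)
            return
        batch.append(call)
        if len(batch) == BATCH_SIZE:         # submit a full batch of draw calls
            submitted_batches.append(batch)
            batch = []

t = threading.Thread(target=render_thread)
t.start()
for i in range(10):                          # "simulation" issues 10 draw calls
    draw_queue.put(f"draw(mesh_{i})")
draw_queue.put(None)
t.join()
print([len(b) for b in submitted_batches])   # -> [4, 4, 2]
```

If this batching work per frame is cheap relative to what the GPU has to do with each batch, the render thread sits idle waiting on the GPU, which matches what I see in ED.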
Somewhat counterintuitively, ED is more demanding on most GPUs than virtually any other title I've run. Note the power limit: my card allows me to set a cap at 150% of TDP; most games only reach around 100-120% at my normal OC, but ED averages about 125% and can spike past 140%. The one exception is the equally counterintuitive Path of Exile, which, with global illumination enabled, crashes into the power limit of many modern cards at astonishingly low clocks.
CPU demand is also largely fixed, but GPU load will scale with resolution and details. Even at low settings/resolutions, ED tends not to be CPU limited until extreme frame rates on any quick processor. Cranking up the resolution just means that the GPU-limited frame rate falls, and CPU utilization along with it, as the render threads have to wait for the GPU.
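The scaling behavior falls out of a simple frame-time model. With CPU and GPU work overlapped in a pipeline, the frame rate is set by whichever stage takes longer per frame. The millisecond figures below are hypothetical, picked only to illustrate the shape of it:

```python
# Toy frame-time model (my own illustration, not ED's engine): pipelined
# frames, so throughput is limited by the slower of the two stages.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when CPU and GPU work overlap across frames."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical GPU-bound 4k scene:
print(fps(cpu_ms=8.0, gpu_ms=20.0))            # GPU-bound: 50.0 fps
# Cut the CPU clock ~24% (4.2 -> 3.2 GHz, so CPU time grows ~31%):
print(fps(cpu_ms=8.0 * 4.2 / 3.2, gpu_ms=20.0))  # still 50.0 fps
# Drop the resolution so the GPU stage shrinks; now the CPU is the limit:
print(fps(cpu_ms=8.0, gpu_ms=4.0))             # CPU-bound: 125.0 fps
```

In the GPU-bound case the CPU stage has slack, so slowing the CPU just eats into idle time until the two stages cross over, which is why the frame rate barely moves.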
Someone in another thread expressed skepticism about ED really being so GPU limited, despite the GPU clearly being pegged at 99-100% load while the most demanding CPU thread never got above 50-60%. So, I did this test in the SRV training scenario, with my standard gameplay settings (4k resolution, custom ultra settings), at two different CPU speeds:
I set my desktop res to 5k with DSR so I could fit a 4k window on screen with CPU-Z and GPU-Z. Then I fired up ED, started the SRV training scenario, and allowed it to fully load before taking a screenshot.
The first image is with the CPU at my 24/7 4.2GHz clock. The second image is with everything identical, except that I subtracted ten from the CPU core multiplier (making it 1000MHz slower).
That single-FPS difference is misleading; the average difference was significantly less, maybe 0.2 fps, from a ~24% reduction in CPU clock. Gameplay is totally indistinguishable at even lower CPU clocks.
As for the BGS, that's mostly a database, and one that isn't even run locally. Your game client just reports what it does and looks up the BGS state from Frontier's servers.
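To illustrate that split, here's a hypothetical sketch (not Frontier's actual protocol or API) of a client that only reports actions while the authoritative BGS state lives server-side. Faction names, action names, and influence deltas are all invented:

```python
# Illustrative client/server BGS split (hypothetical, invented numbers).

class BgsServer:
    """Stand-in for Frontier's server-side BGS database."""
    def __init__(self) -> None:
        self.influence = {"Faction A": 50.0, "Faction B": 50.0}

    def report(self, faction: str, action: str) -> None:
        # The server, not the client, decides how an action moves the needle.
        delta = {"mission_complete": 1.0, "bounty": 0.5}.get(action, 0.0)
        self.influence[faction] += delta

    def lookup(self, faction: str) -> float:
        return self.influence[faction]

server = BgsServer()
server.report("Faction A", "mission_complete")  # client reports what it did...
print(server.lookup("Faction A"))               # ...and looks up the result: 51.0
```

Since the client just sends small reports and reads back state, the BGS adds essentially nothing to local CPU load, however busy the database is on Frontier's end.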