I am not a graphics programmer, but I do have software/hardware training and I keep myself regularly informed about the technical aspects of PCs.
Certain kinds of calculations have been progressively migrating from the CPU to the GPU for many years now, and that trend has only accelerated recently, driven by a higher rate of innovation in 3D graphics engines (the mathematical functions and the algorithms that implement them), paired with the parallel evolution of the processing units in modern GPUs.
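To make that idea concrete, here is a minimal, hypothetical sketch of the kind of migration I mean: the same per-particle position update written once as a plain CPU loop and once as a CUDA kernel that runs one GPU thread per particle. The particle data and function names are my own invention for illustration, not taken from any real engine.

#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Hypothetical per-particle update: advance each position by velocity * dt.
// On the CPU this is a sequential loop over every particle.
void update_cpu(std::vector<float>& pos, const std::vector<float>& vel, float dt) {
    for (size_t i = 0; i < pos.size(); ++i)
        pos[i] += vel[i] * dt;
}

// The same arithmetic expressed as a CUDA kernel: one GPU thread per particle,
// all of them running in parallel across the GPU's cores.
__global__ void update_gpu(float* pos, const float* vel, float dt, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;
}

int main() {
    const int n = 1 << 20;                      // ~1 million particles
    std::vector<float> pos(n, 0.0f), vel(n, 1.0f);

    // CPU version: one sequential pass.
    update_cpu(pos, vel, 0.016f);

    // GPU version: copy data over, launch the kernel, copy the result back.
    float *d_pos, *d_vel;
    cudaMalloc(&d_pos, n * sizeof(float));
    cudaMalloc(&d_vel, n * sizeof(float));
    cudaMemcpy(d_pos, pos.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(d_vel, vel.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    update_gpu<<<(n + 255) / 256, 256>>>(d_pos, d_vel, 0.016f, n);
    cudaDeviceSynchronize();

    cudaMemcpy(pos.data(), d_pos, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(d_pos);
    cudaFree(d_vel);

    printf("pos[0] after CPU step + GPU step: %f\n", pos[0]);
    return 0;
}

The point of the sketch is only that the same arithmetic can be expressed either way; moving it to the GPU frees the CPU for other work, which is exactly the kind of shift an engine has to be written (or rewritten) to exploit.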
From what I have read and observed, a game showing heavy CPU load and low GPU load at high resolutions and max settings can be the result of either an extremely high frame rate on an undemanding engine (e.g. Source 2) or, more often, an unoptimized engine in which many newer GPU hardware features have not been implemented, for some of the following reasons:
1 - The engine cannot be modified to include that code;
2 - The studio is unwilling to modify the engine because of permanently ongoing projects.
I have no idea which category Bethesda falls into. Maybe both?