Most likely yes.
My 4790K isn't getting 100% usage, but we seem to have some people in this thread who are, and people who are upgrading to things like the 8700K to get past CPU issues with such processors.
Personally I cannot get my 4790K over 50% usage even in VR Ultra detail. The only issue I hit is my GPU demand exceeding 100%, which drops me from 90fps to 45fps, but even then I don't see any real significant problem while playing (I didn't even notice it had happened).
So TBH I'm confused about this whole thread, hence me doing testing and working out what is going on. The only thing I have seemingly seen and need to test further is excessive GPU use by ships in an instance. eg: I can't understand why an Imperial Clipper turning up and being rendered as a dot almost out of sight could be using 5-10% of my GPU!? Imagine what happens when half a dozen+ ships are around, even as distant blips... (Are there rendering issues in ED such that there's large GPU usage for no real reason?)
I think you are confusing two different issues. For you, the GPU is the bottleneck and your CPU isn't going to cause you any issues. Plus, as soon as you drop to 45FPS there is less demand on your CPU anyway, because it has to make half as many draw calls per second.
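A quick back-of-envelope version of that point (a minimal Python sketch; the calls-per-frame figure is invented purely for illustration):

# CPU-side draw-call demand scales with framerate: same scene,
# half the frames per second = half the submission work per second.
DRAW_CALLS_PER_FRAME = 2000  # invented figure, just for illustration

for fps in (90, 45):
    print(f"{fps} fps -> {DRAW_CALLS_PER_FRAME * fps:,} draw calls/sec")

# 90 fps -> 180,000 draw calls/sec
# 45 fps ->  90,000 draw calls/sec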
Your i7 4790K is a stronger CPU than my i5 4670K, but mine is paired with a GTX 1080, and for me the CPU sometimes just can't drive the GPU fast enough. Others here have found the same sort of issue trying to drive a 1080ti with an i7 4790K, so they have been waiting for a significant enough jump in the i7 line to justify the upgrade. For me it will be a massive leap: 50% more cores and 200% more threads.
Not entirely sure...
I quote from earlier in the thread (from someone with a 4790K): "inexplicably get 100% CPU use from time to time in ED while in VR."
Maybe there is some sort of link between even higher speed VR rendering and higher CPU usage (as you'd expect), but I'm bemused how higher GPU usage could seemingly double CPU usage (on a 4790K)? I can't get mine above 50% (Ultra settings)...
If you look at my results, even when my GPU is maxed out (in VR Ultra detail) and the FPS reduces from 90fps to 45fps, the CPU usage is hardly affected at all.
Here's me blowing up the Imperial Clipper that seemed to be the reason my GPU couldn't cope; on its destruction my GPU could cope again, so it returned from 45fps to 90fps. You can see the CPU usage is barely affected by my VR (Ultra) rendering rate doubling: some cores went from averaging around 40% to maybe around 50%.
https://i.imgur.com/afOWKN7.jpg
Now I'm not claiming people aren't encountering CPU issues, but I don't think it's as black and white as "VR on high-end GPUs causes this" (on, for example, i7 4790Ks)? It's almost like there's some curve ball involved: maybe some perfect storm of events, and then some sort of code issue?
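For anyone who wants to reproduce charts like mine, this is roughly how the numbers can be logged (a minimal Python sketch; it assumes psutil is installed and an NVIDIA card with nvidia-smi on the PATH, so your tooling may differ):

import subprocess
import time

import psutil  # pip install psutil


def gpu_utilisation() -> int:
    # Overall GPU utilisation in percent, via nvidia-smi (NVIDIA only).
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])


while True:
    per_core = psutil.cpu_percent(percpu=True)  # % per logical core
    print(f"GPU {gpu_utilisation():3d}%  CPU cores {per_core}")
    time.sleep(1)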
Probably... I wondered if you were paraphrasing something I said, because I'd not seen anyone else say 100% except me, and I'm not using an i7. Also, I didn't mean ED was responsible for all of it, but it was very high (causing most of it) and 100% overall.
Again. I need to repeat myself.
The 1070 is comparable to the 980ti in ED.
My 4790k was not the trigger for ASW when I had the 980ti.
This only became prevalent when I upgraded it to the 1080ti.
A 1070 will be overloaded BEFORE the CPU triggers spacewarp.
So your experience is pretty much the same as mine was before I upgraded GPU.
You will NEVER see what I saw with CPU throttling until you get a more capable GPU.
An upgrade I would not recommend until you also can get a DDR4 capable mobo, RAM and CPU.
My i5 4670k was definitely hitting 99% in Haz Res instances, resulting in not even being able to maintain the 45fps mark for spacewarp.
It would drop as low as 30fps in some cases, even when I disabled spacewarp entirely.
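For anyone unclear on what "triggers spacewarp" means here, this is roughly the rule as I understand it (a simplified Python sketch; the real Oculus runtime logic is more involved):

# Simplified model of the ASW/spacewarp decision on a 90 Hz headset.
HMD_REFRESH = 90
FRAME_BUDGET_MS = 1000 / HMD_REFRESH  # ~11.1 ms per frame

def target_fps(frame_time_ms: float) -> int:
    if frame_time_ms <= FRAME_BUDGET_MS:
        return HMD_REFRESH       # keeping up: native 90 fps
    # Can't hold 90: render at 45 and let ASW synthesise every other
    # frame. If the CPU can't even hold 45 (my i5's problem), the
    # framerate just falls through the floor, e.g. to 30 fps.
    return HMD_REFRESH // 2

print(target_fps(10.5))  # 90
print(target_fps(14.0))  # 45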
If we take my Haz Res examples: I'm generally running at 90fps in VR at Ultra. Then, when a few too many ships turn up, I tick over 100% GPU demand to say around 100-110%, so the Oculus runtime says OK, let's instead run at 45fps, which means about 55% GPU usage. And this has minimal effect on my CPU usage, which sits at around 40-50%. You can see this in my charts.
What you're proposing is: if I'd had a superior graphics card which had done that work at, say, 90% instead of 110% (thus remaining at 90fps), my CPU usage could have (more than) doubled?
That sounds odd to me! ie: I don't see how my GPU doing 10% more work would mean my CPU doing 100% more work?
Again, I'm not saying people are not having problems, but I'd love to see some performance charts of CPU and GPU usage of someone hitting this.
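One possible reconciliation, following the draw-call point made earlier in the thread: it wouldn't be the 10% extra GPU work itself, but the framerate doubling, that doubles the CPU's per-second submission work. A rough Python sketch with invented numbers:

# Invented numbers: assume a fixed CPU cost to build each frame.
CPU_MS_PER_FRAME = 6.0  # hypothetical per-frame CPU time

for fps, label in ((45, "GPU maxed, ASW at 45fps"),
                   (90, "faster GPU holding 90fps")):
    busy = CPU_MS_PER_FRAME * fps / 1000  # fraction of each second busy
    print(f"{label}: CPU ~{busy:.0%} busy on the render thread")

# 45 fps -> ~27% busy, 90 fps -> ~54% busy: the CPU load doubles even
# though the GPU only needed ~10% more headroom to hold 90 fps.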
There is no way your 4790k should be bottlenecking even a 1080ti at the moment. While it may give slightly lower FPS than an i7 8700k, it shouldn't be that noticeable. In reality, the higher the resolution, the less of a difference you will see, as the GPU becomes the bottleneck.
I am 100% sure my CPU is bottlenecking my GPU, as it is only an i5. Switching to an i7 2600k and overclocking it to about 4.5GHz should stop any bottleneck, even up to a 1080ti.
There is very little difference running 4K on an overclocked i7 2600k, a Ryzen 1600, or an i7 8700k with a 1080ti. There will be a few FPS between them at most.
I think you are correct that this isn't a black and white issue, for sure. I was planning on upgrading to the 8700k from my old 3770k, not because of any specific issue but because it's been a while since I built a rig. But at the same time I don't like to waste money, so I wanted to make sure I was going to see tangible benefits from the upgrade. I'm running a 1080ti, so no upgrade room on the GPU side. I only use my PC for gaming, so I did a lot of research, and based on everything I found, the CPU upgrade wasn't going to give me much benefit, as the load ultimately shifts to the GPU.

What I did learn is that the new CPU helps a lot if you are running a high-refresh monitor at 1080p, but for VR or 4K it doesn't seem to make as much difference. Also, CPU clock speed still seems to make a big difference: a stock 4770k vs a stock 8700k will show a bigger difference than a 4770k overclocked to 4.5GHz. Memory speed seems to matter too, so if you are running faster DDR3 RAM it will help. Unfortunately there aren't a lot of measured VR benchmarks to look at, but the ones I found that actually used an FCAT capture card to accurately measure FPS showed a lot of CPUs were able to maintain 90 FPS with a 1080ti.

Like you, I have never been able to get my CPU over 60% in ED, but it is clocked at 4.5GHz with DDR3 2000 RAM. It does seem like ED prefers the 8-threaded i7s over the 4-threaded i5s, though. The 8700k is an awesome CPU, but it just doesn't seem like it will give me a measurable improvement over my current setup. The one thing in common across all these systems, from Ivy Bridge to Coffee Lake, is that they are all still communicating over PCI Express 3. I've decided to wait until PCI Express 4 systems start shipping before I upgrade. The standard has been finalized, so there is a chance it will start showing up next year.
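On the FCAT point: once you have a frametime capture, "maintains 90 FPS" just means counting frames that blow the 11.1 ms budget. A minimal Python sketch (the sample frametimes are invented):

# Count budget misses in a 90 Hz frametime capture.
FRAME_BUDGET_MS = 1000 / 90  # ~11.1 ms

frametimes_ms = [10.8, 11.0, 12.5, 10.9, 11.2, 10.7]  # invented sample

misses = sum(1 for t in frametimes_ms if t > FRAME_BUDGET_MS)
print(f"{misses}/{len(frametimes_ms)} frames over budget "
      f"({misses / len(frametimes_ms):.0%})")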
I'm seeing tangible benefits in all games.
Not just ED.
But even regular 2D games like Doom and Space Engineers load faster; "significantly" faster in Doom's case.
And they run far smoother.
To me it has been a greater upgrade than I expected it to be.
I upgraded from an RX 580 Nitro to a 1080Ti, and from an i5 4590 to an 8600k at 4.8GHz, on separate occasions. The improvement in FPS not dropping was much more noticeable when I changed the CPU; the GPU made a small difference. If I had the money back, I would actually go for an 8700k and an overclocked 1080. That would be a cheaper setup, and I bet you couldn't tell the difference.
Running 2.0 on the HMD? Yeah, no doubt. On slightly lower settings, 1.5 or 1.75, would you be able to see a difference without an FPS counter or core monitor?
Well, I am throttling on the 1080ti now, so yeah, I'm sure you could tell the difference if you really wanted to!
Not quite; certain settings dialled up to Ultra seem to do the trick.
But unlike CPU demand, we can also dial GPU-centric settings down, so we can reduce the demands.
Although networking really seems to be a big one still: get a wingman in an instance and I'm still losing 30%+ headroom.
Seriously, FD needs to sort it.
And before you ask, I'm on a 300/300 FIOS connection wired to the router.
I also backed the Pimax, so I'm hoping to run that 'OK'.
And that pushes as many pixels as the CV1 at 1.7 SS just to reach native resolution.
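Rough numbers behind that (a Python sketch assuming the Pimax 5K panels at 2560x1440 per eye, and that SS scales each axis linearly; both are my assumptions, not measured figures):

# Pixels per frame: CV1 at 1.7x SS vs assumed Pimax 5K at native.
CV1_EYE = 1080 * 1200           # CV1 panel resolution per eye
SS = 1.7                        # assuming SS multiplies each axis
cv1_ss = 2 * CV1_EYE * SS ** 2  # ~7.5 Mpx per frame
pimax = 2 * 2560 * 1440         # ~7.4 Mpx per frame (assumed panels)

print(f"CV1 @ 1.7 SS: {cv1_ss / 1e6:.1f} Mpx")
print(f"Pimax native: {pimax / 1e6:.1f} Mpx")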
Networking is the killer. I remember we did a charity event with a rake of players SRV-jumping over Cutters (kind of like Evel Knievel with his motorbike over trucks), and we had a dozen CMDRs instanced at a planetary starport; my framerate in VR was in the teens... As we kept adding players to that instance I had to keep throttling back settings, and in the end it was horrible in VR.