High CPU usage, low GPU usage [bad optimization]


ngrazer

Seriously, what the hell is wrong with Warframe optimization?

My specs are i5-4460, GTX 1060 6GB, 4x4GB RAM.

Look at these:

[Screenshots: 2p2uuxs.png, CkjinKg.png, GbpqIOA.png, MGFewjO.png]

This is just ridiculous. Other games work just fine, and I can't recall a single title where CPU usage would've exceeded GPU usage THIS far. I did a quick Google search, and it turns out there are dozens (if not hundreds) of reports of a similar problem.

Edited by ngrazer

20 hours ago, kai661xbox said:

Run Malwarebytes. I had the same problem. There was Yahoo malware on Google Chrome. Whenever it runs in the background it messes with some things.

Not my case.

The examples in the screenshots were captured without any programs running in the background.

For the sake of comparison, here are a couple of screenshots from Wolfenstein II at max settings:

[Screenshots: tYO8hBF.png, 9Qfu7AG.png]

And lastly, the one where CPU usage is (theoretically) supposed to be higher than normal thanks to all the debris and its physics calculations.

[Screenshot: vxG5bNW.png]

But just in case, I ran some different anti-malware utilities aaaand... nothing.


On 11/15/2017 at 10:30 PM, ngrazer said:

Seriously, what the hell is wrong with Warframe optimization?

My specs are i5-4460, GTX 1060 6GB, 4x4GB RAM.

This is just ridiculous. Other games work just fine, and I can't recall a single title where CPU usage would've hit even 80%. I did a quick Google search, and it turns out there are dozens (if not hundreds) of reports of a similar problem.

Warframe is CPU bound, not GPU bound (hence the higher CPU usage).

Run through Novigrad in The Witcher 3 and, assuming your GPU is powerful enough, you will peg your CPU at 100% (never mind 80%).

Granted, Warframe could do with some tidying up (if Steve's tweets are anything to go by), but favouring the CPU is not a sign of bad optimisation - it's just how the game simulation is run. Warframe's thing is AI (which is an entirely CPU-side job in every game engine). Compared to a lot of titles, Warframe tends to have lots of AI with lots of "unique" (for want of a better word) behaviours, vs. other games which have few behaviours across few AI units.

Unity and CryEngine titles, for example, tend to lean a bit more on the CPU side than, say, a Frostbite engine title, which leans heavily on GPU workload (see here for many of the issues that arise when you focus in this direction). Disclaimer: the ultimate factor in which way it leans is what the game's developers want to accomplish.

"Bad" optimisation is something like 'Arkham Knight', where CPU and GPU usage are all over the shop, or both just outright underutilised. 


On 20.11.2017 at 6:58 AM, MillbrookWest said:

-snip-

Sorry for the confusion; I've edited the line:

On 15.11.2017 at 12:30 PM, ngrazer said:

Other games work just fine, and I can't recall a single title where CPU usage would've exceeded GPU usage THIS far.

Batman Arkham Knight:

[Screenshots: 6MvhxBt.png, 03HkJQ2.png, PtSE8GN.png, x2ygaN1.png, iFk3b25.png, MZsMCcI.png, 8R5rAHz.png, YxI8SJa.png]

Mass Effect Andromeda:

[Screenshots: JdNoKb5.png, VLsbKpA.png, DRneEqA.png, XXXndLr.png, JgJFac1.png, E4I3k1y.png]

Witcher 3:

[Screenshots: r326hPb.png, Olxcap8.png, 1aVa2PI.png, J2aM6ss.png, 78Xpcct.png, QxSys9w.png, pR94qES.png, E8cTZ0C.png, RNgXhMM.png, vbjwxF5.png, Pc6OtsT.png, s7u077w.png, rr81lUL.png]

And a couple of non-AAA titles.

 

Satellite Reign:

[Screenshot: 2GUoE30.png]

Yesterday Origins:

[Screenshot: 0ovDMDu.png]

See the pattern? GPU usage is either higher than CPU usage or more or less in line with it. I can test other games, of course, but unless the game is simplified visually and has lots of physics in it, I doubt the results are going to be any different from these.

Then look at the Warframe screenshots.

During an Eidolon fight GPU/CPU = 1/2. Never mind the other three where GPU/CPU = 1/4.

 

18 hours ago, kai661xbox said:

Adjust the Nvidia 3D Control Panel settings.

Adjust what, exactly? Please elaborate.

 

17 hours ago, White_Matter said:

It is true that the game is CPU-heavy, but it is also badly optimized. It doesn't use all cores.

There is no way around it. You can try overclocking your CPU and capping frames or something, but you can't fully prevent the CPU hogging.

Well, in my case it DOES use all 4 cores, but... I can't even explain how (and why) a game with this amount of models/objects/textures/shaders uses less GPU power (and way more CPU power) than a 2.5D point-and-click adventure (Yesterday Origins) made with Unity.

I wish I could OC my CPU (the safe way, without gambling with voltage), but it's locked.


1 hour ago, ngrazer said:

See the pattern? GPU usage is either higher than CPU usage or more or less in line with it. I can test other games, of course, but unless the game is simplified visually and has lots of physics in it, I doubt the results are going to be any different from these.

Then look at the Warframe screenshots.

During an Eidolon fight GPU/CPU = 1/2. Never mind the other three where GPU/CPU = 1/4.

But TW3 and Andromeda behave exactly as I said they would - one through the middle of Novigrad, and the other on the Frostbite engine.

GPU usage goes up and down based on the work sent to it from the CPU - if the CPU can't send work to the GPU, the GPU can't do work. If you want to tip the scale, just set the resolution to 1440p (or over) and you'll always be GPU bound, since pixel crunching is entirely a GPU job.
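
To put rough numbers on the "pixel crunching" point - simple arithmetic in C#, assuming GPU shading cost scales roughly linearly with pixel count (which is a simplification):

using System;

// 1440p pushes ~1.78x the pixels of 1080p, so the GPU-side share of each
// frame grows accordingly while the CPU-side work stays the same.
class PixelMath
{
    static void Main()
    {
        long p1080 = 1920L * 1080; // 2,073,600 pixels
        long p1440 = 2560L * 1440; // 3,686,400 pixels
        Console.WriteLine($"1440p shades {(double)p1440 / p1080:F2}x the pixels of 1080p"); // ~1.78x
    }
}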

Warframe is not on the same level visually as the likes of Battlefront (for example, another Frostbite game), where so much work is being sent to the GPU that the game code almost doesn't matter - you will always be limited by the renderer (or GPU bound). The result is that Warframe looks 'good', but Battlefront looks 'stunning'. On the flip side, Battlefront doesn't have a 4-storey-tall enemy with three heads, each with its own unique attack patterns.

Most modern titles take this approach partly because of the consoles: consoles have many CPU cores to send lots of work to the GPU, but the cores themselves are pretty weak.

If you want to see this in other games, set them to their highest settings and turn the resolution way down - 800x600 or the like. Assuming there is actually complex CPU work to run (Origins above does not have any, for example), you'll have low GPU usage whereas your CPU usage won't change at all (also use some sort of frame limiter - vsync, adaptive sync, etc.).

EDIT:

[Screenshots - low resolution: batmanak_2017_11_22_2knz74.jpg; 720p: batmanak_2017_11_22_2fzzkr.jpg; 1440p: batmanak_2017_11_22_252bd2.jpg]

GPU line read: Temps / Core Load / Fan Speed / Core Clock 

Notice that between all 3, CPU usage doesn't change, but the GPU change is quite massive. This is a bit of a bad example, but it gives an idea of how GPU load is relative to what you are running on it.

Edited by MillbrookWest

1 hour ago, MillbrookWest said:

But TW3 and Andromeda behave exactly as I said they would - one through the middle of Novigrad, and the other on the Frostbite engine.

They are. Yet for Witcher 3, there are more characters on screen in Novigrad in any one screenshot than in all four of my Warframe screenshots combined.

 

1 hour ago, MillbrookWest said:

If the CPU can't send work to the GPU, the GPU can't do work.

But that's exactly what this topic is all about - bad optimization of CPU usage. My CPU is more than fine for my GPU (which can be proven in any other game), yet Warframe doesn't think so.

Warframe doesn't have a stunning physics model with lots of destructible objects (Crysis, for example) or hordes - I mean hordes, not dozens - of enemies on the screen (like the Total War series, for example). And I don't see how the 'attack patterns' of enemies in Warframe are any different from the 'attack patterns' of enemies in Witcher/Battlefront/any other game.

For example, Overwatch: even more 'attack patterns' per unit, and the graphics aren't any better than Warframe's, yet the GPU/CPU pattern is okay.

Killing Floor 2: a variety of 'attack patterns' PLUS lots of enemies - same pattern.

Total War: Warhammer - more or less the same level of graphics, yet there are HORDES of enemies (with their own 'attack patterns', btw) - same.

Left 4 Dead 2? CSGO? Destiny 2? Crysis? Shadow Warrior 2? Battlefield 3? DOOM? WoT? Fallout 4? Bioshock 3? Dishonored 2? Prey? This list can go on, like, forever.

My CPU starts bottlenecking the GPU only in games like GTA5/Watch Dogs 2 (vast open worlds and TONS of real-life-like AIs) or in PvP matches with 64 players (Battlefield 1). Warframe doesn't look like any of these, so yeah, there is something terribly wrong with its optimization.

 

Edited by ngrazer

12 hours ago, ngrazer said:

 

Well, in my case it DOES use all 4 cores, but... I can't even explain how (and why) a game with this amount of models/objects/textures/shaders uses less GPU power (and way more CPU power) than a 2.5D point-and-click adventure (Yesterday Origins) made with Unity.

I wish I could OC my CPU (the safe way, without gambling with voltage), but it's locked.

 

Interesting. In my case it doesn't. It's heavy on 1 core / 2 threads, and the rest are basically idle. So the reason it doesn't use GPU power is that the CPU can't feed enough frames to the GPU. When the game is running @ 144 fps on max detail, GPU usage is close to 99%. But when things start to go down (lots of enemies, 4 people using their abilities and weapons, especially on PoE), the CPU hogging starts and my GPU usage drops to 50-60%.

I have a 7700K @ 4.6 GHz, 32 GB of 2666 MHz DDR4 RAM, and a 1060 6GB.


If you think you have a specific issue with perf, you could submit your EE.log in a support ticket. Find wherever it is that you think there is an issue and leave a comment in chat so the Devs can find the location in the log.

Link to support: https://digitalextremes.zendesk.com/hc/en-us

17 hours ago, ngrazer said:

They are. Yet for Witcher 3, there are more characters on screen in Novigrad in any one screenshot than in all four of my Warframe screenshots combined.

^"On the screen" is GPU work. The reason your CPU load spikes in Novigrad is that there is AI all around you - not just the ones you can see, but the ones you pass, and the ones the game is currently populating as you move about.

You can cull a mesh from rendering (the GPU) as soon as it moves off screen (or goes behind an object); this means the only AI your GPU renders are specifically the ones on screen. But simulation of that AI (CPU) does not stop - depending on the sort of simulation, CPU work continues regardless (even off screen). Think Oblivion, with the NPCs that walked between towns: your GPU never renders them doing so until you physically get close to them, yet regardless of where you are, the CPU still has that NPC simulation running.

This is the same with Warframe: just because it's not on screen does not mean it's been forgotten about. While the GPU may have culled the mesh a long time ago, the CPU is still running whatever the simulation is. If you've seen Steve's tweets (which I think were deleted), the code could do with a bit of optimising, but you can only do so much with game logic (refer to my post above). NOTE: I don't know what C++ is like, but if it's similar to C#, it's usually a matter of working the logic in a different way vs. redefining the way your logic executes in its entirety (which, as Steve said, would kill Warframe).
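
A minimal sketch of that cull-vs-simulate split in C# (illustrative, not engine code; the classes and the 20% visibility figure are made up): every agent pays the CPU cost each tick, but only the on-screen ones generate render work:

using System;
using System.Collections.Generic;

// Every agent is simulated on the CPU every tick; only agents that survive
// visibility culling produce GPU work. Off-screen AI is "free" for the GPU
// but costs the CPU exactly as much as visible AI.
class Agent
{
    public bool OnScreen;
    public void UpdateAi() { /* behaviour tree, pathfinding, etc. - runs regardless */ }
}

class CullingSketch
{
    static void Main()
    {
        var rng = new Random(1234);
        var agents = new List<Agent>();
        for (int i = 0; i < 100; i++)
            agents.Add(new Agent { OnScreen = rng.NextDouble() < 0.2 }); // ~20% visible

        int rendered = 0;
        foreach (var agent in agents)
        {
            agent.UpdateAi();      // CPU cost: paid for ALL 100 agents
            if (agent.OnScreen)
                rendered++;        // GPU cost: only for the visible ones
        }
        Console.WriteLine($"simulated {agents.Count}, rendered {rendered}");
    }
}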

17 hours ago, ngrazer said:

Warframe doesn't have a stunning physics model with lots of destructible objects (Crysis, for example) or hordes - I mean hordes, not dozens - of enemies on the screen (like the Total War series, for example). And I don't see how the 'attack patterns' of enemies in Warframe are any different from the 'attack patterns' of enemies in Witcher/Battlefront/any other game.

Nor does it compare to the visuals in Battlefront (for example) - see my post above; it's less graphically intense, so why does the GPU need to work harder? You posted the Origins screenshot - I don't think you noticed that your GPU also switched into a lower power state - 1.4GHz vs 2.0GHz. Warframe is doing the same in a bunch of your screenshots (as low as 1GHz) - see the note in the edit to my last post.

'Destructible objects' are more GPU bound, since we're talking draw calls, meshes, effects, etc. (modern APIs like Vulkan/DX12 have relatively few issues with these; older ones are a bit iffy). Total War works around the AI issue in a fairly predictable way - there is a reason the unit stacks get blobbed together into one giant unit, vs. an entire army of 1000+ unique soldiers which you can move about individually.

I touched on a lot of the points brought up in my posts above.

 

Edited by MillbrookWest

Try running 16 GB of RAM. S#&$, if I had 8 more - which I'm not buying (not because I'm poor, but because I'm cheap and got mine for free) - I could play GTA V without all the stutters. I used to have a computer with a lot of RAM for a CAD program, but I noticed a tiny decrease in gaming performance due to the massive amount of RAM. It sounds like Warframe just does not like you. Happy Thanksgiving.


On 22.11.2017 at 5:45 PM, kai661xbox said:

Look, it's very easy, and if it doesn't work, then turn it back on: disable Superfetch, un-park the CPU cores, and set the page file min and max to 8192 MB.

The cores are unparked already, but I'll try disabling Superfetch. Ed.: Turns out it was disabled too.

On 23.11.2017 at 6:57 AM, kai661xbox said:

I have no idea. Mine uses 99% GPU, my RAM is usually at 3.8 GB and my page file at 4.2 GB, with 62% CPU usage.

What are your specs btw?

 

On 23.11.2017 at 8:03 AM, MillbrookWest said:

If you think you have a specific issue with perf, you could submit your EE.log in a support ticket. Find wherever it is that you think there is an issue and leave a comment in chat so the Devs can find the location in the log.

Thanks for reminding me - I almost forgot Support can check performance issues as well.

 

On 23.11.2017 at 8:03 AM, MillbrookWest said:

You can cull a mesh from rendering (the GPU) as soon as it moves off screen (or goes behind an object); this means the only AI your GPU renders are specifically the ones on screen. But simulation of that AI (CPU) does not stop - depending on the sort of simulation, CPU work continues regardless (even off screen). Think Oblivion, with the NPCs that walked between towns: your GPU never renders them doing so until you physically get close to them, yet regardless of where you are, the CPU still has that NPC simulation running.

This is the same with Warframe: just because it's not on screen does not mean it's been forgotten about. While the GPU may have culled the mesh a long time ago, the CPU is still running whatever the simulation is.

Not quite the same, as in Warframe enemies aren't spawned across the whole map at the very beginning of the mission - they keep spawning throughout the mission (and in some cases they can spawn right behind your back, too). However, there is a possibility that their AIs are loaded right away as soon as the mission starts and the models come later on, but I can't prove/disprove that.

On 23.11.2017 at 8:03 AM, MillbrookWest said:

Nor does it compare to the visuals in Battlefront (for example) - see my post above; it's less graphically intense, so why does the GPU need to work harder? You posted the Origins screenshot - I don't think you noticed that your GPU also switched into a lower power state - 1.4GHz vs 2.0GHz. Warframe is doing the same in a bunch of your screenshots (as low as 1GHz) - see the note in the edit to my last post.

Yup, I know that the lower the resolution, the less the GPU is used / the more the CPU is used.

And I did notice, of course - the GPU adjusts its frequency automatically depending on the game's behavior.

And like I said, Left 4 Dead 2 and Shadow Warrior 2 aren't any better than Warframe in terms of graphics, and they also have lots of mobs with their own AIs, yet I've never seen 100% CPU / 50% GPU situations in them.

On 23.11.2017 at 8:03 AM, MillbrookWest said:

'Destructible objects' are more GPU bound, since we're talking draw calls, meshes, effects, etc.

But before any of that happens, the game calculates how many bits there will be and how they will behave (physics-wise) - which is a CPU-bound process.

 

On 23.11.2017 at 12:29 AM, White_Matter said:

Interesting. In my case it doesn't. It's heavy on 1 core / 2 threads, and the rest are basically idle. So the reason it doesn't use GPU power is that the CPU can't feed enough frames to the GPU. When the game is running @ 144 fps on max detail, GPU usage is close to 99%. But when things start to go down (lots of enemies, 4 people using their abilities and weapons, especially on PoE), the CPU hogging starts and my GPU usage drops to 50-60%.

I have a 7700K @ 4.6 GHz, 32 GB of 2666 MHz DDR4 RAM, and a 1060 6GB.

Oh, btw, today I was a bit curious whether running the game in single-threaded mode is any different and, well...

Multi-threaded/single-threaded

1. https://i.imgur.com/O9uKPLC.png https://i.imgur.com/2B3Nywp.png

2. https://i.imgur.com/cYTzCx1.png https://i.imgur.com/am8Natk.png

3. https://i.imgur.com/gfba11h.png https://i.imgur.com/qmXDxsL.png

4. https://i.imgur.com/5p8hkDu.png https://i.imgur.com/s27c1La.png

5. https://i.imgur.com/KT4xSDl.png https://i.imgur.com/qAY1Rto.png

The 5th one is pretty interesting in my opinion. 

 

 

Edited by ngrazer

1 hour ago, ngrazer said:

Yup, I know that the lower the resolution, the less the GPU is used / the more the CPU is used.

Unless a dev is doing render work on the CPU, which next to no modern dev does (except last-gen, with Sony's first-party studios working on the Cell), resolution has no effect on CPU workload (you can see this in the pics I provided). Changing the in-game settings (model quality, LOD transition, shadow distance, etc.) may have an impact if they relate to the CPU preparing work to send to the GPU.

1 hour ago, ngrazer said:

And like I said, Left 4 Dead 2 and Shadow Warrior 2 aren't any better than Warframe in terms of graphics, and they also have lots of mobs with their own AIs, yet I've never seen 100% CPU / 50% GPU situations in them.

But before any of that happens, the game calculates how many bits there will be and how they will behave (physics-wise) - which is a CPU-bound process.

I don't really know what Shadow Warrior 2 is, so I have no idea what the game is like, but if you play L4D2 the way I play[ed] it, I typically supersample the image (with MSAA to boot, a taxing AA solution by today's standards) and run with an unlocked frame rate - both things entirely GPU bound, which results in a high GPU load.

In the case of mesh destruction in a 'destructible object', this value is typically pre-determined - the effect is usually accomplished by switching out two completely different meshes (which saves on perf). Otherwise it's RNG. The RNG in Warframe is determined from a 'seed' - Warframe uses its own seed, but most programming languages come with their own pseudo one. In C#, for example, you can use 'Random rand = new Random();' to get a random number generator; that seed, as I recall, is derived from the system clock (so it's not very good). It's usually not a super CPU-intensive task.

This is the actual result of RNG generation in Warframe:
dPNh8nd.png
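
To make the seed point concrete - a small C# sketch (this is generic .NET behaviour, not Warframe's RNG): seeding a generator explicitly makes its 'random' sequence fully reproducible, which is how a single stored seed can drive a whole mission's randomness:

using System;

// Two generators with the same seed produce identical sequences, so a game
// only needs to store/share the seed to reproduce a "random" outcome.
class SeedSketch
{
    static void Main()
    {
        var a = new Random(42);
        var b = new Random(42);
        Console.WriteLine(a.Next(100) == b.Next(100)); // True - same seed, same sequence
        Console.WriteLine(a.Next(100) == b.Next(100)); // True

        // Parameterless Random() picks its own seed (historically from the
        // system tick count), so results vary from run to run.
        var c = new Random();
        Console.WriteLine(c.Next(100));
    }
}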

You can read more here if you like:

 

 

Edited by MillbrookWest

1 hour ago, ngrazer said:

Oh, btw, today I was a bit curious whether running the game in single-threaded mode is any different and, well...

Multi-threaded/single-threaded

Just to touch on this, as the setting is usually misinterpreted.

"Multi-threaded Rendering"

The renderer is being threaded - this is the work being sent to the GPU.

If you disable this, you move the renderer to a single thread (which is what the 5th set shows).
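
Roughly what that toggle means, as a C# sketch (an illustration of the general technique, not Warframe's actual implementation): with multi-threaded rendering on, the simulation thread hands finished frames to a dedicated render thread instead of submitting them itself:

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;

// Simulation produces frames; a dedicated render thread consumes them and
// "submits" them to the GPU. Disabling the option is equivalent to calling
// the submit code inline on the simulation thread.
class RenderThreadSketch
{
    static readonly BlockingCollection<int> FrameQueue = new BlockingCollection<int>(boundedCapacity: 2);

    static void Main()
    {
        var renderThread = Task.Run(() =>
        {
            foreach (int frame in FrameQueue.GetConsumingEnumerable())
                Console.WriteLine($"render thread submitted frame {frame}");
        });

        for (int frame = 0; frame < 5; frame++)
        {
            Thread.Sleep(10);        // stand-in for simulation work
            FrameQueue.Add(frame);   // hand off; simulation never touches the GPU
        }

        FrameQueue.CompleteAdding();
        renderThread.Wait();
    }
}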


2 hours ago, MillbrookWest said:

Just to touch on this, as the setting is usually misinterpreted.

"Multi-threaded Rendering"

The renderer is being threaded - this is the work being sent to the GPU.

If you disable this, you move the renderer to a single thread (which is what the 5th set shows).

Well, you can clearly tell that just by looking at any two of the screenshots.

What I'm pointing out is that, for some reason, there are areas in the game where CPU usage with multi-threading enabled is almost 50% higher than with single-threading - yet the output is absolutely the same.

Which, in my opinion, is strange and shouldn't even be possible if the game were optimized for CPUs with multiple cores.

Ed.: OK, I think that maybe - maybe - I've found a possible culprit responsible for such a CPU/GPU ratio, but I've got to test things first to see if it's true.

Edited by ngrazer

On 24.11.2017 at 5:11 PM, ngrazer said:

Ed.: OK, I think that maybe - maybe - I've found a possible culprit responsible for such a CPU/GPU ratio, but I've got to test things first to see if it's true.

Okay, so I was trying to find more games (among those that I own) that allow you to lock their frame rate via settings, but so far I've found only four of them, so... =|

Not so long ago I experienced a 100% CPU+GPU peak during Eidolon fights (it happened twice, btw), which resulted in a short microstutter, a single black screen flicker, and Windows switching to Basic for around 10 minutes or so. So I thought: to prevent this from happening again, I need to limit CPU usage somehow. Limiting the frame rate was the most obvious and simple solution, and I decided that, well, I don't need more than ~80 FPS for games anyway, so it wouldn't hurt to limit FPS globally. But lowering the refresh rate of the monitor (from 144 to 100) resulted in ghosting - on the mouse cursor and such, for example. Then I recalled that RivaTuner (from MSI Afterburner) lets you lock FPS for every app, or just for a selected one.

But Warframe was still like:

[Screenshot: XPI8hjV.png]

And yesterday I increased the FPS in Warframe back to 144 and noticed that, for some reason, CPU usage is the same for both the 144 and 72 FPS locks - which is ridiculous. After some testing, I think Warframe just doesn't know how to work with / doesn't like the RivaTuner FPS limiter - it looks like the game still 'feeds' the CPU additional work even though it shouldn't.
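
A guess at why the two limiters could behave differently, sketched in C# (an assumption about how external limiters hook in, not verified against RivaTuner's internals): an in-engine cap can stall before the next frame's simulation even starts, while an external hook typically only delays the final present call, so the engine may keep simulating at full speed:

using System;
using System.Diagnostics;
using System.Threading;

// In-engine style limiter: the wait happens BEFORE the next frame's CPU work,
// so capping to 60 FPS also cuts simulation to 60 updates per second. An
// external limiter that only delays presentation would leave this loop
// spinning as fast as it can.
class LimiterSketch
{
    const double TargetFrameMs = 1000.0 / 60; // 60 FPS cap

    static void SimulateAndBuildFrame() => Thread.SpinWait(100_000); // stand-in for CPU work
    static void Present(int frame) => Console.WriteLine($"presented frame {frame}");

    static void Main()
    {
        var clock = Stopwatch.StartNew();
        double nextDeadline = 0;

        for (int frame = 0; frame < 5; frame++)
        {
            SimulateAndBuildFrame();

            nextDeadline += TargetFrameMs;
            double wait = nextDeadline - clock.Elapsed.TotalMilliseconds;
            if (wait > 0)
                Thread.Sleep((int)wait); // idle here instead of simulating more

            Present(frame);
        }
    }
}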

Anyway, time for the screenshots. There will be three screenshots of the same situation but with different FPS limits: 144 max in-game (IG), 60 max in-game (IG), 60 max via RivaTuner (RT). Hope I didn't mess them up when inserting them here.

Batman: Arkham Knight. I was too lazy to mess with unlocking FPS by editing the game files, but decided that even the in-game limit of 90 will work just fine. So here we go:

90ig   60ig   60rt

Mass Effect Andromeda. 

144ig   60ig   60rt

144ig   60ig   60rt

The Witcher 3.

144ig   60ig   60rt

144ig   60ig   60rt

Basically, under normal circumstances, CPU usage at 60 FPS is lower than with a higher limit (obviously), and there is little to no difference between the in-game limiter and RivaTuner.

Warframe.

144ig   60ig   60rt

144ig   60ig   60rt

EXTREMELY noticeable huh? 

Don't know why or how.

And it would've been nice to see a couple of other people do the same in Warframe, just to make sure it isn't just me, but oh well.

 

 

Edited by ngrazer

4 hours ago, ngrazer said:

-snip-

I already covered all the points raised above, so I'll just leave it at this.

Basically, this is how you run Warframe (with a frame limiter):

warframe.x64_2017_11_d9but.jpg

Whereas this is how I run Warframe (with a higher limit and a higher resolution - see my L4D2 note above):

warframe.x64_2017_11_e0ywc.jpg

The only settings changed are 'Resolution' and 'Frame Rate' - two settings that are entirely GPU bound.

The reasons why have already been covered above.


On 26.11.2017 at 3:03 AM, MillbrookWest said:

I already covered all the points raised above, so I'll just leave it at this.

Basically, this is how you run Warframe (with a frame limiter):

What I meant by asking other people to test: in the case of a 120Hz monitor, it would've looked like this (any graphics settings and resolution you like, but they should remain the same in all three cases):

1. No FPS limit, vsync on - which means ~120 FPS (if you can hit those numbers, of course).

2. Disable vsync in-game and set the built-in Warframe limiter to 60 FPS (half of your monitor's refresh rate).

3. Disable vsync in-game, set the built-in Warframe limiter to 120 FPS (to match the refresh rate), and set the FPS limit via RivaTuner/another third-party limiter to the same 60 FPS.

4. Capture screenshots showing CPU and GPU usage in all three cases.

What I wanna know is: does Warframe just not like third-party frame limiters, or is it something on my side? Because, as you can see in my screenshots, CPU usage with the in-game limiter set to 60 is X, with 144 it is 2X (understandable), and with RivaTuner set to 60 it is also 2X (the heck?), whereas it should've been just X.

Oh, btw, what are your specs?

 


(2 weeks later...)

Yeah, I have this problem too, though not on my 1060 6GB, which was part of my old setup. So let me start from there. My old PC ran a 1060 6GB with an i5 4790K at 4.3 GHz on the core and just a basic 8 GB of DDR3 at 1600 MHz. The problem lies solely with my new rig: I'm running an R7 1800X at 3.8 GHz with a 1080 Ti and 16 GB of 3200 MHz RAM. I can play every game I own fine at 1440p with decent settings and full GPU usage.

In Warframe, my GPU runs at its base clock in very specific situations. What do I mean? Specifically, this only happens when I play in matches with other players. When there are other players in the game, my GPU struggles to push more than 50-60 frames at 1440p. That doesn't seem all that bad at first, but during those moments there's a lot of microstutter as the GPU core clock fluctuates between 1540 MHz and around 1700 MHz, whereas in most other games I own (even CPU-intensive ones) the GPU always runs at or near 100% usage, which lets it sit at a stable, flat-lining 2012 MHz on the core. That's almost a 15-25% difference in core clock.

For example, I own and play AC Origins, which, if you haven't heard, is brutal on CPUs because of the layers of DRM. Even AC Origins' in-game monitor shows my CPU pegged at 60% to 100% usage on 11 cores (the 1800X has 8 physical and 8 simulated cores), but I get a smooth and stable FPS of 60-90 with few dips to the high 50s. Only in Warframe does my GPU choke like this.

Furthermore, any time I play solo, the GPU ramps up to max usage like nothing's wrong and I get a consistent 130 or higher FPS. When I checked the CPU usage in MSI Afterburner, nothing was capping out, so I don't think it's CPU bottlenecking. There are a few cores sitting at 60% or so, but that's not a big deal - nothing compared to AC Origins regularly hitting 80% to 100% on almost all 11 utilized cores. Now, I didn't take pictures of my graphs, so sorry, you're gonna have to take my word on this one.

