
GPU Usage of In-Game Settings


SoulSpeed


Hey guys,

I'm posting some tests I just ran (on U14.6.1) with my Gigabyte WindForce R9 280x Rev 2.0, going through Warframe's settings.

I'm using GPU Shark on my second monitor to monitor the GPU usage (a rough logging sketch follows below, for anyone who wants to reproduce this).

All tests were taken on the Liset while standing still, with the kubrow out as an extra load.

DE can use this as an indicator of where to improve performance.

Players can use it as a performance guideline.
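
If you want to repeat this without eyeballing a second monitor, here's a minimal Python logging sketch. GPU Shark itself has no scripting interface that I know of, so read_gpu_usage() is a hypothetical stand-in you'd wire up to whatever tool or library your card's vendor exposes:

```python
import csv
import time

def read_gpu_usage():
    """Hypothetical stand-in: replace with a real query for your card
    (e.g. parsing a vendor CLI tool's output). Returns usage in percent."""
    return 0.0

# Sample once per second and append to a CSV, so runs taken with
# different in-game settings can be compared afterwards.
with open("gpu_usage_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "gpu_usage_percent"])
    for _ in range(60):  # one minute of samples
        writer.writerow([time.time(), read_gpu_usage()])
        f.flush()
        time.sleep(1)
```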

Anyways, on to the pictures :)

 

1: Settings I used to play before taking these tests.

GPU usage: 99%

One thing to note: everything here is in DX11 :)

 

2.0: Low settings with everything off

GPU usage: 40-60% + spikes

2.1: With ATI enhancements

GPU usage: Around 60% + spikes (vs 2.0)

 

3.0: Raised the settings one by one from Low to High

GPU usage: 58-64%

3.1: With ATI enhancements

GPU usage: Around the same (vs 3.0)

 

4.0: Added WE FX + Shadows (WE FX didn't do much)

GPU usage: 70%

4.1: With ATI enhancements

GPU usage: No change (vs 4.0)

 

5.0: Added Dynamic Lighting

GPU usage: 70-73% (1-3% increase)

5.1: With ATI enhancements

GPU usage: No change (vs 5.0)

 

6.0: Added Color Correction

GPU usage: No Change

6.1: With ATI enhancements

GPU usage: No Change (vs 6.0)

 

7.0: Added Bloom

GPU usage: Around 78% (5-10% increase)

7.1: With ATI enhancements

GPU usage: 1% increase (vs 7.0)

 

8.0: Added Anisotropic Filtering

GPU usage: 2-3% increase

8.1: With ATI enhancements

GPU usage: No Change (vs 8.0)

 

9.0: Added High Dynamic Range

GPU usage: 85-87% (5% increase)

9.1: With ATI enhancements

GPU usage: 1% increase (vs 9.0)

 

10.0: Added Runtime Tessellation

GPU usage: 86-88% (1% increase)

10.1: With ATI enhancements

GPU usage: No Change (vs 10.0)

 

11.0: Added Anti-Aliasing FXAA

GPU usage: 94-96% (8% increase)

11.1: With ATI enhancements

GPU usage: No Change (vs 11.0)

 

12.0: Raised Anti-Aliasing to SMAA

GPU usage: 98-99% (at least a 4% increase vs FXAA)

12.1: With ATI enhancements

GPU usage: No Change, since it's already maxed out (vs 12.0)

 

13.0: Local Reflections, with Runtime Tessellation and Anti-Aliasing disabled (comparable to 9.0 = 85-87%)

GPU usage: 99% (at least a 15% increase)

13.1: With ATI enhancements

GPU usage: No Change, already maxed out

 

13 Local Reflections update: Tested LR alone (everything else OFF/Low)

GPU usage: Still 99% maxed out (40%+ increase)

Sorry, no pic of this one, since I wanted to test it ASAP while I was writing the above.
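
To make the step-by-step numbers easier to compare, here's a small Python summary using rough midpoints of the ranges I reported above (so these figures are approximations from my run on the R9 280x in DX11, not exact measurements):

```python
# Rough midpoints of the GPU usage ranges reported above (R9 280x, DX11).
steps = [
    ("2.0  Low, everything off",   50),
    ("3.0  Sliders Low -> High",   61),
    ("4.0  WE FX + Shadows",       70),
    ("5.0  Dynamic Lighting",      71),
    ("6.0  Color Correction",      71),
    ("7.0  Bloom",                 78),
    ("8.0  Anisotropic Filtering", 80),
    ("9.0  High Dynamic Range",    86),
    ("10.0 Runtime Tessellation",  87),
    ("11.0 Anti-Aliasing (FXAA)",  95),
    ("12.0 Anti-Aliasing (SMAA)",  98),
    ("13.0 Local Reflections",     99),  # capped: the GPU is saturated here
]

# Print each cumulative reading plus the delta the new setting added.
previous = None
for name, usage in steps:
    delta = "" if previous is None else f"  (+{usage - previous}%)"
    print(f"{name:30s} {usage:3d}%{delta}")
    previous = usage
```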

Edited by SoulSpeed

Beyond CPU/GPU usage, I see another problem in Warframe: multi-GPU technologies like CrossFireX and SLI.

I used CrossFireX to test the frame rate. Surprisingly, at max settings I get 100% GPU usage (dipping to 92% only for a split second) whether CrossFireX is on or off.

Guess what? I got 30 more frames a second in certain angles WITHOUT CrossFireX, which implies CrossFireX is not supported.

Before you criticize AMD's CrossFireX: with it on, I saw a significant 20 fps increase in a demanding game like PlanetSide 2, and increases of up to 60 fps in other games, Bioshock Infinite (an AMD-optimized game) for example.

 

I would be thankful to see Warframe support CrossFireX at some point, because right now it would be more worthwhile to sell both cards second-hand on eBay and put the money toward a high-end Nvidia card.

Edited by DarkBabylon

I always just turn bloom, motion blur, and depth of field off. They make it harder to see what I'm doing and they hurt performance. 

 

That said, most of my framerate drops appear to be caused by network issues rather than visuals. I mean, I can run Crysis at high settings at a stable 60+ fps, but in Warframe my framerate regularly drops from 80 to 40 or lower in multiplayer games.

Edited by Plasmaface

Quoting Plasmaface: "I always just turn bloom, motion blur, and depth of field off. They make it harder to see what I'm doing and they hurt performance. [...]"

Done that...100 FPS increase! :O

But Warframe looks like a medium-quality plastic toy ;_;

Turning off depth of field and motion blur definitely gives a big bonus to personal gameplay performance, though.

Edited by DarkBabylon

Quoting DarkBabylon: "Beyond CPU/GPU usage, I see another problem in Warframe: multi-GPU technologies like CrossFireX and SLI. [...] I got 30 more frames a second in certain angles WITHOUT CrossFireX, which implies CrossFireX is not supported. [...]"

I would like to agree with you, but unfortunately I can't. With CrossFireX, the problem actually lies in the drivers.

And since CF is only being optimized for AMD Gaming-branded titles, there's little hope for it here.

Nvidia's SLI support is a bit broader, but it mostly sticks to Nvidia-partnered and AAA games.

IMHO it's cheaper to buy one high-end card and upgrade over the years as game engines get more demanding. I'm not saying to buy every high-end series that comes out; I skipped the HD 7xxx series, for instance (going from an HD 6970 to the R9 280x).

We could get into the Nvidia/ATI discussion again, but I'd rather refrain.

In short: Nvidia has broader game compatibility than ATI, but Nvidia's drivers have been lacking over the last year or so, while ATI's have improved drastically (just a shame about CF, though).

 

Quoting Plasmaface: "[...] most of my framerate drops appear to be caused by network issues rather than visuals. [...] in Warframe my framerate regularly drops from 80 to 40 or lower in multiplayer games."

I'm not sure about this, but my guess is that clients have to wait until they receive more data from the host; that data doesn't arrive at a stable rate, yet it needs to be processed ASAP to stay in sync, hence the fps drops. It also depends on how well optimized the graphics and network code are on both sides.

This is just a guess, though.

There are more factors to take into account, too: the clients' upload speeds, the host's download speed, NATs, and so on; the list goes on.
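
Here's a toy Python simulation of that guess (nothing to do with Warframe's actual netcode, and all the numbers are made up): if the client's frame loop has to block on the host's next state packet, network jitter gets added straight onto the frame time, so fps dips exactly when packets arrive late.

```python
import random
import time

RENDER_MS = 12  # hypothetical render cost per frame (~80 fps on its own)

def wait_for_host_packet():
    """Hypothetical host update: usually arrives fast, occasionally late."""
    delay_ms = random.choice([1, 2, 3, 40])
    time.sleep(delay_ms / 1000)
    return delay_ms

for frame in range(8):
    start = time.perf_counter()
    waited = wait_for_host_packet()  # sync point: block until host data arrives
    time.sleep(RENDER_MS / 1000)     # then render the frame
    frame_ms = (time.perf_counter() - start) * 1000
    print(f"frame {frame}: waited {waited:2d} ms -> {1000 / frame_ms:5.1f} fps")
```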

 

Quoting DarkBabylon: "Done that...100 FPS increase! :O But Warframe looks like a medium-quality plastic toy ;_; [...]"

DoF and motion blur, IMO, are just for those who want more realism, which is quite nice in some games.

But in fast-paced games like Warframe, or FPS games in general, they're just downsides: less performance and less effective gameplay.

Bloom also depends on how it's tuned in the code, though. Most games set it way too high, hence people turning it off.

Here in Warframe it's nice on the environment but a bit too much on characters/abilities. Just my opinion.
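
For anyone curious what "tuned in the code" means for bloom: a typical bloom pass boils down to a brightness threshold plus an intensity multiplier, roughly like the NumPy sketch below. This is a crude approximation; real engines use HDR buffers and Gaussian blur chains, and I don't know how Warframe implements it.

```python
import numpy as np

def bloom(image, threshold=0.8, intensity=0.5):
    """Crude bloom: keep only the bright pixels, blur them, add them back.
    image: float array of shape (H, W, 3) with values in [0, 1]."""
    bright = np.where(image > threshold, image, 0.0)  # bright-pass filter
    blurred = bright
    for axis in (0, 1):  # box blur standing in for a Gaussian blur chain
        blurred = (np.roll(blurred, 1, axis) + blurred
                   + np.roll(blurred, -1, axis)) / 3.0
    return np.clip(image + intensity * blurred, 0.0, 1.0)

# A low threshold or a high intensity is what makes bloom look "way too high".
frame = np.random.rand(64, 64, 3)
subtle = bloom(frame, threshold=0.9, intensity=0.3)
overdone = bloom(frame, threshold=0.4, intensity=1.5)
print(subtle.mean(), overdone.mean())
```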

 

Anyways, this is mainly meant as an indicator for those who have trouble with their GPU, be it performance or heat.

More GPU load/usage means more heat, even if your performance is still good enough for stable fps.

Edited by SoulSpeed