
Posts posted by pyr0ball

  1. Update from DE Support! In my support ticket, they suggested I try a hotfix from nVidia: http://nvidia.custhelp.com/app/answers/detail/a_id/3698

     

    Not sure why it's relevant since it's labelled as a fix for Sony Vegas, but I'm gonna try it out when I get home tonight.

     


    This should be obvious:

    Laptop Max FPS 90. This is all the GPU can attain with little to render.

    Desktop Max FPS 300. This is all the GPU can attain with little to render.

     

    Enter a stage with actual code to process (NPCs moving around, players interacting with stuff, talking to each other, dancing, jumping off walls, and other such tomfoolery):

    Both GPUs effectively equalize.

     

    CPU utilization stays similar on both PCs.

     

    DE have said that coding for console will make the PC version a better game. There is little to do outside of waiting for DE to improve their code. DX12 should help with the render pipelines and should give high-end players a little boost, as it practically makes the render pipeline fully multithreaded.

     

    I did consider that implication; however, what didn't make sense to me is that my laptop is still getting better framerates in the stage where there's a lot of code to process. Let's ignore the GPU differences for this instance and focus only on the CPUs, since we've determined that neither system's graphics capabilities are ever completely saturated by Warframe:

     

    Here's a hardware benchmark comparison between the two: http://cpu.userbenchmark.com/Compare/Intel-Core-i7-4710HQ-vs-Intel-Core-i7-3770K/m11499vs1317

     

    Here again are the framerates observed on both systems at the same location (the Mercury relay) at the same time of day:

    Desktop: 1920x1080@61hz = 126,489,600 pixels per second

    Laptop: 2880x1620@70hz = 326,592,000 pixels per second

     

    I think the pixels-per-second figure is probably more related to the GPU, so I'll just do the percentage comparison between framerates. That works out to roughly a 13% increase in performance from a CPU that should be performing 20-30% slower than the desktop's.
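
    For anyone who wants to check the arithmetic, here it is as a trivial Python snippet (plain math on the observations above, nothing Warframe-specific):

    ```python
    # Sanity check of the numbers above.
    desktop_px = 1920 * 1080 * 61   # observed fps on the desktop
    laptop_px = 2880 * 1620 * 70    # observed fps on the laptop

    print(f"Desktop: {desktop_px:,} pixels/second")  # 126,489,600
    print(f"Laptop:  {laptop_px:,} pixels/second")   # 326,592,000

    # Framerate-only comparison, both ways, since the baseline matters:
    print(f"Desktop baseline: {(70 - 61) / 61:.1%}")  # ~14.8%
    print(f"Laptop baseline:  {(70 - 61) / 70:.1%}")  # ~12.9%, the ~13% above
    ```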

     

    I realize programming isn't as cut-and-dried as these numbers; I just wonder why this is happening.

     


    I have similar utilization, and the thing is, it is multithreaded: the main calling thread takes more load overall, but it still won't oversaturate the CPU.

    So that bottleneck is ruled out.

    I think Warframe runs more stably when you push more pixels, which is rather weird; at the very least, try running your PC at that resolution.

    There is a way to do this virtually, so it downsamples your game.

    Here's a forum post about it: http://www.neogaf.com/forum/showthread.php?t=509076

    I am going to try this out, hoping it gives me more stable frames.

     

    OK, I'll give this a try as well. Is this similar to GeDoSaTo, actually? As a matter of fact, I may not need to use this utility at all. My monitor accepts a 4K signal input, which is basically hardware downsampling.

     

    I'll give both a try and post results.

  2. OK, I just did a little test for the sake of consistency and accuracy in my statements.

     

    On the desktop I'm seeing a drop in performance from 300fps down to roughly 60fps in the Relay, using a desktop-class CPU (i7-3770K).

    On the laptop I'm seeing a drop in performance from 90fps down to roughly 70fps in the Relay, using a laptop-class CPU (i7-4710HQ).

     

    As far as my hardware knowledge goes, mobile cores are usually not as beefy, performance-wise, as desktop ones, and Intel largely abstained from increasing performance between Sandy Bridge and Haswell, instead opting for greater energy efficiency. So what's mind-boggling to me here is that I'm seeing an 80% drop in performance on the desktop between the Liset and the Relay, and only about a 22% drop in performance on a LAPTOP, with a lower-performance CPU and a SINGLE GPU with roughly the same performance as a GTX 960, while simultaneously pushing roughly 13% more pixels per second (pixel math below).

     

    Desktop: 1920x1080@120hz = 248,832,000 pixels per second 

    Laptop: 2880x1620@60hz = 279,936,000 pixels per second
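
    Spelled out as a quick Python check (same numbers as above):

    ```python
    # The drops and throughput above, as plain arithmetic.
    desktop_drop = (300 - 60) / 300   # Liset -> Relay on the desktop
    laptop_drop = (90 - 70) / 90      # Liset -> Relay on the laptop
    print(f"Desktop drop: {desktop_drop:.0%}")  # 80%
    print(f"Laptop drop:  {laptop_drop:.0%}")   # ~22%

    desktop_px = 1920 * 1080 * 120    # 248,832,000 pixels/second
    laptop_px = 2880 * 1620 * 60      # 279,936,000 pixels/second
    print(f"Laptop pushes {laptop_px / desktop_px - 1:.1%} more pixels/second")  # ~12.5%
    ```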

     

    I get what you all are saying about Warframe being CPU-dependent, but something is wrong with these numbers.

  3. Wow this blew up while I wasn't looking!

     

    Yeah, I take your point that you probably should be reaching your laptop's performance, if not more. The one thing I can think of right off the bat is that you have a multi-monitor setup (by that I believe you mean Warframe itself is being played across multiple monitors, right?). If that's the case, try playing on one monitor and see if it's better, because Warframe has had lots of issues with multiple monitors in the past.

     

    For clarification, I do have multiple monitors; however, I restrict my gaming to one and run that at 120Hz. 3D stereoscopic is disabled.

     

    Warframe has the same issues on my system, and it is a beast; the thing is, Warframe never fully utilizes my CPU or my GPU.

    In the Liset it does utilize them fully, but according to a game dev I know, they have limited the engine for the sake of the easy PC-to-console port they use.

    So there is a reason why it does this, but the game should be optimized for PC and for console independently.

    They should have optimized the engine like CryEngine, which fully uses your GPU and a fair part of your CPU for AI and other miscellaneous game tasks.

     

    A nice addition they could make (and I've said this earlier in other posts): if a friend joins your session and you are the host, your computer needs to calculate all the graphics and, on top of that, all the AI movement and hitscanning. This is performance-demanding and takes a chunk of your CPU.

    So my idea is to distribute a portion of the mobs to other players, have them also calculate movement and hitscan detection, and commit the results to the host and the other players (a toy sketch follows). This way it is a bit more network-demanding, but it would make the game smoother even with a lagging host, because the clients could detect the lag and take over part of the hosting, keeping the mobs snappier than leaving everything to the host.

    I know this is hard to do, but it is possible; some games that don't have dedicated hosting for their games also do this to improve lag and load overall.
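
    Purely as a toy sketch of the idea (all names hypothetical; this is not how Warframe's netcode actually works), distributing mob ownership across peers could look something like this:

    ```python
    # Toy sketch: spread mob simulation across the host and the clients instead
    # of leaving it all on the host. Entirely hypothetical, not Warframe's netcode.
    from itertools import cycle

    def assign_mob_owners(mob_ids, peers):
        """Round-robin each mob to a peer. The owning peer runs that mob's AI and
        hit detection each tick and broadcasts the results to everyone else."""
        return {mob: peer for mob, peer in zip(mob_ids, cycle(peers))}

    def simulate_tick(me, ownership):
        # Each peer only simulates its own share of the mobs.
        for mob in (m for m, owner in ownership.items() if owner == me):
            pass  # run AI + hitscan for this mob, then send a state update

    peers = ["host", "client_A", "client_B"]
    ownership = assign_mob_owners(range(6), peers)
    print(ownership)  # {0: 'host', 1: 'client_A', 2: 'client_B', 3: 'host', ...}
    simulate_tick("client_A", ownership)
    ```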

     

    Another thing they could do is utilize more of the GPU VRAM that is available; I have seen games use more and perform better.

     

    I know that the bulk of Warframe's player base has weak computers and uses laptops that are not meant for gaming, but this is not an excuse for bad performance on high-end gaming rigs.

     


    Chao, The Roaring Lion Warlord of Shadow Lords

     

    Your explanation here makes a lot of sense on the surface; however, it doesn't account for my laptop's performance being better than my desktop's.

     

     

    CryEngine, the Frostbite engine, Unreal Engine 4, AnvilNext: they are all designed from the ground up to be multithreading-aware.

     

    Warframe is still a very single-threaded game, as evidenced by its support for DX9; as such, things in Warframe still happen in a somewhat linear order.

     

    To give an example (a bad one; Warframe isn't like this at all): in a single-threaded application, if your CPU got stuck moving an enemy from position X to position Y, the entire game would have to wait until that process completed. Nothing else could happen until the enemy unit had been moved, and the GPU couldn't render any new frames, since the CPU wasn't queuing any jobs for it to work on. Only once the enemy unit has been moved can the rest of the code execute and queue up jobs for the GPU to work on (amongst other things).

    In a multithreaded system, if that same enemy got stuck processing on the main core, you could offload code that wasn't dependent on the enemy's results to the next core (and any other core the system had). So while the enemy might still get stuck, you might still be able to walk around and do things without having to wait for the enemy to finish moving.

     

    To put this into perspective: if your render thread runs on the same core as the aforementioned stuck enemy, then on a system that isn't multithreading-aware, when the enemy gets stuck, the render thread gets stuck too. In this example you are CPU-limited, with the CPU bottlenecking your GPU, since it can't supply the information needed to render successive frames.
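
    Here's a deliberately contrived sketch of that contrast in Python (purely illustrative; Warframe's engine works nothing like this, and the "stuck enemy" is just a blocking sleep):

    ```python
    # Contrived illustration of the single-threaded vs multithreaded contrast above.
    import threading
    import time

    def move_enemy():
        time.sleep(2.0)  # the enemy update stalls for two seconds

    def render_frame(i):
        print(f"rendered frame {i}")

    # Single-threaded: rendering waits behind the stuck enemy.
    start = time.time()
    move_enemy()
    render_frame(0)
    print(f"single-threaded: first frame after {time.time() - start:.1f}s")

    # Multithreaded: offload the enemy update; rendering carries on meanwhile.
    start = time.time()
    worker = threading.Thread(target=move_enemy)
    worker.start()
    for i in range(3):
        render_frame(i)  # frames keep coming while the enemy is stuck
    worker.join()
    print(f"multithreaded: frames started after {time.time() - start:.1f}s")
    ```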

     

    So in the case of the OP's GPU usage, the CPU isn't able to supply enough data to feed the GPU. It is getting stuck doing other things, and the GPU has to wait for it. That's not an indication of a bad CPU; it merely points out that the code is... at its limits (I guess you could say?).

     

    IMHO, the base engine and code for Warframe hold it back. Ironically for high-end PC users, Warframe being coded for console makes the PC version perform better: DE working on optimization for the consoles should mean better performance for PCs, since multithreaded coding is almost a necessity on console.

     

    Shouldn't a core on my CPU be saturated if a single thread is bottlenecking? My overall CPU usage sits between 15% and 21%, and a single core never really goes above 60% utilization.

  4. I got some screenshot examples of the issue.

     

    Here's an example from inside the Liset, where utilization is high, as is the framerate:

    http://i.imgur.com/G2ebp4u.png

     

    The blue highlight is where I was loading in, and the red highlight is the high GPU utilization while in the Liset.

     

    Now here's a screenshot in the Mercury relay:

    http://i.imgur.com/Qoj49JB.png

     

    The blue highlight is while I was in the Liset, the two little dips are the loading screen and when I skipped the cinematic, and the red highlight is from the cinematic sequence up until I was in the Relay itself.

     

    Note that the FPS has dropped from 300fps down to 60fps, and the utilization is down from 60% to 25%. In all my years of gaming I've never once seen this behavior apart from in Warframe. Usually FPS drops occur when the GPU is getting oversaturated and can't keep up, but in this case the FPS is dropping because the GPU isn't being used to its full potential.

     

    Throw me a bone DE!

  5. So about a month and a half ago I upgraded my Radeon HD 7870 GHz Edition to an MSI GeForce GTX 970 Twin Frozr V, and so far it's great. The only problem is that I get stuttering when I play for a certain amount of time on the GTX 970, whereas the 7870 ran the game with absolutely no problem. I am aware that the GTX 970 has only 3.5GB of fast memory, and that's actually when the stuttering happens: when the game hits 3.5GB of usage. This is not a temperature issue; my i5-4670K runs at about 50-60°C under load, and my GPU runs about the same, 40-60°C under load. Another problem is that sometimes I will dip from 60 down to 40, and it just happens randomly; when that happens my GPU and CPU temps are fine, so I have no idea what could be causing my frames to drop so dramatically. I also have V-Sync on. Also note that I have Google Chrome, iTunes, Steam, and Skype running while I game.

     

    I assume you're referring to the 3.5GB vs 4GB memory utilization issue on the GTX 970s.

     

    Just a heads-up: Warframe's memory utilization never hits that limit; it's simply not big enough. The memory usage you're probably seeing go up that high is your system RAM. If you're running 3.5GB of RAM in your system, then yes, you probably will encounter many usage-related issues. The good news is that system RAM is one of the cheapest and easiest upgrades you can do.

     

    Also keep in mind that if you're running a 32-bit operating system, you've already reached your cap. You will need to install a 64-bit OS in order to take advantage of 4GB+ of RAM.
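
    If you want to check quickly, here's a small Python snippet (assuming you have Python installed; psutil is a third-party package, installed with `pip install psutil`):

    ```python
    # Report process bitness, machine architecture, and total installed RAM.
    # Assumes the third-party psutil package is installed (pip install psutil).
    import platform
    import struct

    import psutil

    # Pointer size of the running interpreter gives the process bitness.
    print(f"Python process: {8 * struct.calcsize('P')}-bit")
    print(f"Machine architecture: {platform.machine()}")

    total_gb = psutil.virtual_memory().total / (1024 ** 3)
    print(f"Total system RAM: {total_gb:.1f} GB")
    if total_gb <= 4:
        print("4GB or less: more RAM (and a 64-bit OS) is the first upgrade to try.")
    ```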

  6. I'm having a severe throttling issue on my desktop. What I've noticed is that as GPU demand goes up (when the graphics requirements should increase, e.g. the Void), the GPU utilization and FPS both drop, while in situations that require very little GPU work (e.g. on board the Liset), the utilization and FPS appear uncapped.

     

    System details:

    Windows 8.1 Pro x64

    EVGA nVidia GTX 970 ACX 2.0 SC x2 SLI @ 1448MHz OC

    Running latest drivers (353.30 currently; observations started back at 348.xx)

    i7-3770K @ 3.8GHz (unparked)

    32GB DDR3 1600MHz

    Acer HN274H 120Hz monitor @ 1920x1080/120Hz

     

    Warframe settings: 

    Borderless Fullscreen (due to multi-monitor setup)

    vSync off (for this test case; usually I use nVidia Inspector to limit FPS to 120)

    DX11 only

    x64 client

    AntiAliasing OFF

    All other settings set to High

     

    Test case examples (a minimal logging script for gathering numbers like these follows the list):

     - On Board Liset:

      *FPS: 350-450 (with no vSync/frame limiter)

      *GPU Utilization: 86/90%

      *GPU Temps: 68/71°C

     - In Void:

      *FPS: 45-75

      *GPU Utilization: 38/45%

      *GPU Temps: 55/59°C
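
    For anyone who wants to collect the same kind of numbers, a minimal polling script might look like this (assumes an NVIDIA GPU and that the nvidia-smi utility is on your PATH):

    ```python
    # Minimal GPU utilization/temperature logger. Assumes an NVIDIA GPU with
    # nvidia-smi on the PATH. Polls once per second; Ctrl+C to stop.
    import subprocess
    import time

    QUERY = [
        "nvidia-smi",
        "--query-gpu=index,utilization.gpu,temperature.gpu",
        "--format=csv,noheader",
    ]

    while True:
        # One line of output per GPU, e.g. "0, 38 %, 55"
        result = subprocess.run(QUERY, capture_output=True, text=True, check=True)
        print(time.strftime("%H:%M:%S"), result.stdout.strip().replace("\n", " | "))
        time.sleep(1)
    ```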

     

    I've been struggling with this issue for a few months now and have a lot of data and test points to go from, but the ones above are a good example of what I'm seeing. What I don't understand is why the GPU utilization goes down when it ought to be staying high and keeping the framerate up. I've heard of some pretty heavy throttling issues with the Maxwell GPUs, but I did some vBIOS tweaking that should have eliminated that as a factor, and the GPU temps are never at a threshold where they would start throttling, as far as I can tell.

     

    Another weird point is that this issue does NOT seem to happen on my laptop's Maxwell GPU (970M), which runs at 2880x1620 and gets a solid 60+ FPS all the time, even in the Void, with the same settings as above apart from the fullscreen mode.

     

    Is this a Warframe optimization issue? Is it an nVidia optimization issue? Where should I start poking?

     

    Thanks in advance

     

    -Pyr0ball

  7. - I don't remember how reliable the Control Panel is. I haven't used it in a long time; I only use Inspector's driver-level profiles. I presume those are the same profiles the Control Panel uses, but maybe they're not.

     

    Using nVidia Inspector I was able to set a max FPS of 120 (it's still doing 127fps instead, but that's fine), and I'm getting no slowdowns when blink notifications come in now!

     

    I'm still gonna report this as a bug so the dev team can take a look at it. Workarounds are nice, but not everyone is savvy enough to adjust things at the driver level.

  8. This sounds silly, but it seems like your computer is syncing to the taskbar animation when it appears.

    I'd suggest forcing VSync in a different manner in your video drivers.

     

    I tried forcing this in the nVidia Control Panel (Manage 3D Settings), but it seemed not to override the game's settings. I'm still getting 270-300fps (in the Liset) with all of the recommended settings. I also tried VSync ON (as opposed to Adaptive) as well as a couple of other tweaks.

     

    Something else I'm noticing is that Warframe doesn't appear to be utilizing my second card at all. When I'm in a very heavy-load environment (Excavation, or the Void with lots of enemies), the first card only hits about 45-60% utilization and the second only about 3%, but my framerates take a major hit (down to 45-75fps).

  9. OK, I ran a test with V-Sync disabled.

     

    The screenshot with no blink shows just under 300fps, fluctuating between 260 and 290:

    http://cloud-4.steamusercontent.com/ugc/535138812652614814/63849BE405FD0D7017D45E99EFCE3B3A9DEB52BC/

     

    With the blink active it's very similar in framerate, so I'm more confident it has something to do with the v-sync frame limiting:

    http://cloud-4.steamusercontent.com/ugc/535138812652626554/1343F712762774D28615BF01B5B9CF81F153FA5C/

  10. I get something similar to this whenever I get a notification via Skype or other programs that want my attention... I found that turning off notifications seems to work for me, but seeing as I check all my stuff frequently, it might not suit those who rely on notifications for everything... (this is what has worked for me; I'm unsure if it will work for you)

    I could do that, yes, but if there's an issue I figure it's better to report it. I'd been living with this issue for months and months before I really buckled down and started testing in earnest. I also realize this will probably affect a small subset of people, as I'm 80% sure this is a bug that's causing the v-sync rate to drop to 60 when the blink comes up. I'm gonna disable v-sync and test again to confirm.

    I run a 120Hz monitor and a very high-end setup to drive it. I noticed that whenever there's a "blink" notification on one of the windows running in the background, my framerate drops to 60fps. It's almost as if the v-sync frame limiting is being toned down when that happens. According to GPU-Z, my load is never higher than 45%.

     

    Here's the system specs:

     

    Windows 8 x64 Pro

    Acer H274HN (x3 in extended mode; the primary is the only one running at 120Hz)

    nVidia GTX 970 2x SLI

    Intel i7-3770K @ 4.2GHz

    32GB 1600MHz DDR3

     

    This is the notification blink I'm talking about: http://i.imgur.com/2f23SVQ.png

     

    Edit: I also forgot to mention, I run in borderless windowed mode because of my multi-monitor setup and lots of multitasking.

  12. I've had several instances where, after the host leaves during an Archwing Interception, the people who remain have major issues. Either the game crashes altogether (most common) or other things go wrong. One of my squad members was unable to see all the enemies even though they were shooting him, while I (in the same match) was unable to ascend or descend. I was literally stuck at one elevation for the remainder of the game and kept drifting downward little by little.

  13. I saw something about the G15 in the archive when looking up info about this, so I know this has been brought up before, but something that wasn't addressed, and that I believe would be massively useful, is offloading chat text to the keyboard's LCD.

     

    It's a pain having to open and close the chat window all the time in the main menu, and for people like me with really high DPI (running a 2560x1600 monitor), it can be a little hard to see the text on-screen; furthermore, we have to switch between various tabs. The Logitech LCD works off text output for a lot of the apps that run on it, so I'm sure getting the chat output to display there would be a cinch. The little arrow buttons beside the LCD could also be used to quickly and efficiently switch between chat channels, which would let us avoid needing to pause, click the tab, then start typing.

     

    All this was prompted by Black Friday delivering me a nice new shiny G19, so I'm eager to get some decent use out of it, and since Warframe is my favorite game right now, I figured I'd poke you guys about it :)

     

