
Tips To Improve Performance In The Game


Rabcor

Question

I've heard several reports of people having performance issues with the game. I'll start by quickly showing the settings I recommend (for the impatient), and at the end of the thread I'll give a more detailed explanation of the less important settings (mostly the ones that are turned off in the recommended example).

These settings are meant to balance visual quality and performance, giving the best visual quality for the least performance cost.

I'm used to playing with all settings maxed out (apart from DoF and Bloom). I tried playing with the settings I recommend here and didn't really feel a difference (I can only see the difference if I compare screenshots).

 

 

Table of contents:

 

1. In-Game Settings

2. Windows (7) Settings and Optimization

3. Graphics Card Driver Settings (Nvidia Control Panel, Catalyst Control Center and Intel HD Graphics)

4. Description of the video options in the game.

In-Game Settings

Let's see; I'll start by explaining resolution and field of view.

Resolution is, in my opinion, the most important option in any game. It scales the game (UI, textures and all) up or down to your selected resolution. For quality it's generally a good idea to run at your screen's native resolution (usually just the maximum available resolution), but it's also the setting most likely to improve your performance when lowered. Play with it until you find what's acceptable for you.

 

Additionally, as explained by Ecotox (https://forums.warframe.com/index.php?/topic/43711-tips-to-improve-performance-in-the-game/page-3), lowering the resolution too much may actually result in worse performance, because lower resolutions shift more of the load onto the CPU. More on that here if you're interested: http://www.overclockers.com/forums/showthread.php?t=705646
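To put numbers on how much work a lower resolution saves on the GPU side, pixel count is a reasonable first-order proxy for fill and shading cost. A minimal sketch (the resolutions below are just common examples, nothing Warframe-specific):

```python
# Rough rule of thumb: GPU shading/fill work scales with pixels rendered.
# This only estimates the GPU side; as noted above, very low resolutions
# can shift the bottleneck to the CPU instead.

def pixel_ratio(native, lowered):
    """Fraction of native-resolution pixels rendered at the lowered one."""
    nw, nh = native
    lw, lh = lowered
    return (lw * lh) / (nw * nh)

print(round(pixel_ratio((1920, 1080), (1600, 900)), 2))  # 0.69 -> ~31% fewer pixels
print(round(pixel_ratio((1920, 1080), (1280, 720)), 2))  # 0.44 -> under half the pixels
```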

 

Field of View (FoV) sets the width of the camera (how far to the sides you can see). It does this by what feels like moving the camera further away from (or closer to) your character, similar to how zoom works on real cameras, which lets you see more (or less) on your screen in general. I have this option maxed out, and depending on how DE coded it, it may have no performance hit at all, so I recommend setting it to a value you find comfortable. If you have a big screen it's generally a good idea to max it out, but if you set your FoV too high, your "camera" will start to look "fish-eyed" (http://www.dansdata.com/images/gz124/160.jpg). (Kudos to DE for having this option; a lot of developers fail to include it in their games.)
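For the curious, the usual geometry behind FoV sliders can be sketched like this. I don't know which convention DE's slider actually uses (horizontal, vertical, or something else), so treat the values as purely illustrative:

```python
import math

# Standard conversion from vertical FoV + aspect ratio to horizontal FoV.
# A wider aspect ratio at the same vertical FoV shows more to the sides,
# which is the "zoomed out" feel described above.

def horizontal_fov(vertical_fov_deg, aspect_w, aspect_h):
    v = math.radians(vertical_fov_deg)
    h = 2 * math.atan(math.tan(v / 2) * (aspect_w / aspect_h))
    return math.degrees(h)

print(round(horizontal_fov(60, 16, 9), 1))  # ~91.5 degrees on a 16:9 screen
print(round(horizontal_fov(60, 4, 3), 1))   # ~75.2 degrees on a 4:3 screen
```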

Aspect Ratio can usually be left on Auto, I would think.

Texture Memory (more below in the descriptions) is something I believe most people can handle on high. If you have lag at your max resolution with the recommended settings, try setting it down to medium or low and see if you get any major performance gains (otherwise you'll most likely just have to reduce the resolution); good textures generally don't need high-end graphics cards in my experience. However, if you're on integrated graphics (Intel, or some older Nvidia/ATI cards) with shared memory, it's recommended to set this to low: the higher you set it, the more shared memory gets used, which can easily hurt performance, and the more shared memory is used the more error-prone it becomes.

(I would've placed the example images in spoilers, but the forums say that if I do that, I've posted too many images. The examples were taken at 1920x1080 with FoV maxed.)

Recommended (example screenshot)

Alternative 1 (example screenshot)

Alternative 2 (example screenshot)

Alternative 3 (example screenshot)

Don't forget that under the gameplay options you can select a region. Pick a proper region to connect to people closer to you and decrease the chance of high latency (both while hosting and while not hosting). On another note, if you're hosting, you'll want to set your settings lower than the best you can handle: if you lag while hosting, everyone lags, and as the host your computer is under the most pressure. (I personally recommend avoiding hosting if you don't have a good gaming computer.)

 

 

Examples of the game with settings maxed

 

Normal (http://i.imgur.com/YIfelln.jpg)

With Bloom (http://i.imgur.com/A9ytt41.jpg)

With Bloom + DoF (http://i.imgur.com/7BcP1MU.jpg)

 

Sorry, I forgot to take one with DoF and no Bloom. Guess I'll have to live with it.

----------------------------------------------


Windows Settings

These tips are mostly centered on Windows 7, since I'm unfamiliar with how Windows 8 works and have never tried it. If you're still using Windows XP after all these years, I'll just assume you know how to optimize it for performance. Many of these tips work on all three Windows versions.

1. Set the power scheme to High Performance
-This forces your computer to utilize more of its CPU power. It may increase performance minimally, and will mostly be useful while you're hosting a game rather than at other times. (In theory.)

2. Clean your registry
-It's supposed to help with performance. CCleaner is the most commonly used program for this; alternatives are Advanced SystemCare and Glary Utilities.

3. Defragment the hard drive Warframe is installed on
-This will most likely improve your performance, especially if you haven't defragmented recently. I recommend using third-party software rather than the Windows defragmenter; Diskeeper is an excellent choice, and I hear Defraggler is good too.

4. Disable unnecessary programs
-Extra running programs can easily hog your CPU power and RAM, which is bad; consider running the game without Steam. There's one quick way to do this: open the Task Manager (Ctrl+Shift+Escape or Ctrl+Alt+Delete) and go to the Processes tab. Find "explorer.exe", right click it and press "End Process Tree". This turns off most running programs, including the Windows UI itself (desktop and taskbar). Afterwards, go to the Applications tab in the Task Manager, press "New Task..." and type in "explorer" or "explorer.exe" without the quotation marks.

5. Disable Aero (Optional)
-This should free up some CPU power, GPU power and RAM, but your computer will look uglier. You can use third-party software such as Game Booster by IObit to do this temporarily for you, among other things. (If you don't know much about computers but need more performance in games, Game Booster can help.)

6. Disable auto-scans in your antivirus software while playing
-If your antivirus program starts scanning while you play, you'll lag to pieces. How to disable it is something you'll need to find out yourself, since there are many different antivirus programs.


Now you know plenty about general maintenance for Windows computers. Congratulations. (It's mostly the registry cleaning and the defragmentation.)

Note: Defragmentation is not needed on an SSD at all, and if done with software that doesn't recognize the SSD as such, it'll most likely just shorten its lifetime.

----------------------------------------------
for Nvidia Users:

Nvidia Control Panel Settings

 

Make sure you have the latest drivers (http://www.nvidia.co.uk/Download/index.aspx?lang=en-uk).

Right click your desktop and find "NVIDIA Control Panel"

Find "Manage 3D settings"

You have two options here:

1. Set global settings (settings for your entire computer).
2. Select the "Program Settings" tab, add Warframe (The Evolution Engine) and set the settings just for Warframe.
Note: if you mouse over "The Evolution Engine" you'll see a file path. If it points to .../launcher.exe it's the wrong one; if it points to .../warframe.exe or .../warframex64.exe or similar, that's the right one.

Next, you should make them look something like this: (screenshot of the recommended Nvidia Control Panel settings)

Texture Filtering - Quality: try its highest and lowest settings and see if you feel a difference. I don't.

That's about all I can think of right now for the Nvidia Control Panel.

for AMD/ATI users:

Catalyst Control Center Settings
Make sure you have the latest drivers (http://support.amd.com/us/gpudownload/Pages/index.aspx).

 

Right click your desktop and find "Graphics Properties", or something that says "Catalyst Control Center" or similar (I don't use AMD a lot, so I wouldn't know exactly). No promises that my control center looks like yours, but I'm fairly certain all the options shown in my picture will be somewhere in your CCC. To find them, I had to go to the "Gaming" tab and find "3D Application Settings", so look for something similar. (I was in Advanced View.)

After that, the settings you'll want are similar to these: http://i.imgur.com/ft4pKFY.png

(The image limit per post on these forums is 5; this would've been the 6th.)
Texture Filtering Quality: try setting it to the highest and lowest and see if it affects your performance. I don't know if it will, but I doubt it.
Tessellation Mode: you want tessellation off; it's a very resource-intensive feature in my experience.

 

Intel HD Graphics Settings

 

Right click your desktop, find "Graphics Properties", access the "3D" tab and move the 3D Preference slider to Performance.

Or press "Custom Settings", set everything to Application Settings and set Texture Quality to Performance.

Also go to the "Power" tab and set Power Plans to Maximum Performance. (If you're on a laptop, remember to check what "Power Source" is set to when you adjust this; on battery you probably want maximum battery life over performance.)
----------------------------------------------

 

Description of Settings

 

-- Anti-Aliasing:

Anti-Aliasing (http://en.wikipedia.org/wiki/Anti-aliasing_filter) is a feature made to smooth edges. In general, anti-aliasing is a GPU-intensive task, and on older cards it will almost definitely cause lag in most games. However, Warframe seems to use a recent AA algorithm called FXAA, developed by Nvidia a couple of years back. FXAA performs better than any other form of AA I've seen; its quality isn't as great as multisampled or supersampled AA, but it's good for a cheap performance price.

Here are some examples of what it does

 

Alternative 3

Alternative 3 + AA (http://i.imgur.com/biqh5lk.jpg)

 

Note: Anti-aliasing does in fact affect performance, but it just so happens that in Warframe it (should) affect it fairly little. Beyond that it's all about whether you like this effect or not. I personally don't think it's worth sacrificing performance for, since I like things clearly defined just as much as I like them with smoothed edges.
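To give a feel for what a post-process AA pass like FXAA is doing, here's a toy luma-contrast edge detector. This is only a sketch of the general idea; Nvidia's real FXAA shader is considerably more elaborate, and the 0.1 threshold is a made-up value:

```python
# Post-process AA first finds edges by comparing perceived brightness (luma)
# between neighbouring pixels, then blends along those edges. This toy only
# does the detection step, on a tiny software "image".

def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def detect_edges(img, threshold=0.1):
    """Flag pixels whose luma differs sharply from a right/down neighbour."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            l = luma(img[y][x])
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(l - luma(img[ny][nx])) > threshold:
                    edges[y][x] = True
    return edges

# A hard black/white vertical edge: only pixels touching it get flagged.
img = [[(0.0, 0.0, 0.0)] * 2 + [(1.0, 1.0, 1.0)] * 2 for _ in range(3)]
flags = detect_edges(img)
print(flags[0])  # [False, True, False, False]
```

The blending step that real FXAA performs afterwards is what actually softens the flagged edges; doing that per pixel every frame is why even "cheap" AA still costs something.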

 

-- High Dynamic Range:

HDR (http://en.wikipedia.org/wiki/High-dynamic-range_rendering), similar to bloom, is a lighting effect designed to preserve details that may otherwise be lost due to limited contrast ratios.

Nvidia's explanation of HDR goes like this: bright things can be really bright, dark things can be really dark, and details can be seen in both. This effect is often used to simulate your eyes readjusting to environments (for example, enter a dark room and you'll see badly for a while; go back into a bright area and the light can be blinding).

HDR is said to be a GPU-intensive feature, so disabling it may very well increase performance.

 

-- Local Reflections:

Local Reflections enables reflections; for example, objects sitting in a corner may be reflected off the floor depending on the lighting. It also causes lights to reflect off objects they touch. You probably won't notice this effect much if it's enabled, but disabling it has been said to slightly increase performance.

 

-- Dynamic Lighting:

Dynamic Lighting is a lighting effect you'll mostly notice if you've got weapons with elemental effects on them, as it allows lighting to change actively. Pictures explain it better.

 

The door turns red in one picture because there's a red panel next to it. Without dynamic lighting, panels wouldn't emit light, because different panels can have different lights (if lighting weren't dynamic, each panel could only have one color of light, or things would get more complicated and ugly).

Dynamic Lighting On (http://i.imgur.com/xw1N1dy.jpg?1)

Dynamic Lighting Off (http://i.imgur.com/xhIBLyv.jpg?1)

 

Example of elemental effects (this one was taken by dukarriope, thank him for it; it has DoF, Bloom and Color Correction on).

 

-- DoF / Motion Blur:

Depth of Field (http://en.wikipedia.org/wiki/Depth_of_field) and Motion Blur (https://en.wikipedia.org/wiki/Motion_blur) are two separate things not to be confused with each other, but this game uses the DoF effect to generate motion blur (I think it might be cool to be able to enable a little DoF without enabling motion blur). Depth of Field is a "focus" effect, commonly used in movies to focus on one area of the screen and blur the rest; read the Wikipedia link for details. Motion Blur should do two things:

1. Blur Moving Objects

2. Blur your screen when the camera moves (since everything on the screen is moving while the camera moves)

 

This option can have a nasty performance hit, and I recommend disabling it regardless of how good your computer is (it just looks plain ugly in my opinion). An example of its performance hit: a friend was playing on his laptop at my place, and I noticed his game was really laggy even though he has a decent laptop. He still had this option on, so I told him to turn it off, and as soon as it was off the lag quite literally disappeared. If that's not a huge performance hit, you tell me what is!

 

-- Bloom:

Bloom (http://en.wikipedia.org/wiki/Bloom_(shader_effect)) is a shader effect that basically makes bright lights in the background bleed over onto objects in the foreground, creating the illusion that a bright spot is brighter than it really is. I don't think this is a very GPU-intensive task, but it might have some performance hit. I personally hate this effect because it can be so blinding; that's why I recommend turning it off.

 

Bloom ON (http://i.imgur.com/VNLNAVQ.jpg) (Alternative 1)

Bloom OFF (http://i.imgur.com/IMgtEjJ.jpg) (Recommended)

 

-- Color Correction:

Color Correction should have little to no performance hit the way I see it. It seems to apply an "overlay" or "filter" of sorts that changes the temperature of your colors (http://en.wikipedia.org/wiki/Color_temperature), or something alike. A similar technique was commonly used by the Skyrim modding community in its earlier days to make the game look better with a minimal performance hit, and some still use it. I linked to a mod that emerged from that (the original was called a post-process injector): http://skyrim.nexusmods.com/mods/11318/?tab=3&navtag=%2Fajax%2Fmodimages%2F%3Fid%3D11318%26user%3D1&pUp=1

 

It's mostly personal preference whether you want to use it or not.

 

Color Correction ON(Recommended)

Color Correction OFF(Alternative 2)

 

-- Character Shadows:

The way I understand this one, it adds shadows to moving objects (your character and your opponents); I think it also adds self-shadowing (you can cast a shadow on yourself) to the game. It's most likely a very intense task for the GPU, and I recommend disabling it unless your computer can handle it. This is the second thing you should disable if you're having performance problems in a game (after anti-aliasing).

 

-- Nvidia PhysX Effects:

PhysX (http://en.wikipedia.org/wiki/PhysX), as the name suggests, is an Nvidia-exclusive (last I checked) realtime physics engine originally developed by Ageia. It tries to simulate physics; Warframe seems to use it for all sorts of particle effects and eye candy. In the spoiler is a video to showcase it.

Note: PhysX is only eye candy and does not affect gameplay. It just looks awesome.

 

-- Vertical Sync (Vsync):

Vertical Synchronization (http://en.wikipedia.org/wiki/Screen_tearing) is a feature that tries to make the game's FPS match your monitor's refresh rate (usually 60 Hz; the number of Hz can be considered your monitor's maximum supported FPS). Triple buffering is an option designed to significantly reduce vsync's performance cost when you can't reach the desired 60 FPS.

 

Its original purpose is to prevent "screen tearing". Whether you've got a good GPU or a bad one, it's generally a good idea to have this option enabled, since it usually improves image quality, and many people claim it also improves performance. (Running at 80 FPS on a 60 Hz monitor gains you nothing; matching the monitor's 60 FPS makes the image more fluid and smooth, and it also helps that the framerate is less prone to jumping, since it's locked to your monitor's maximum supported framerate.)

 

Complete details here: http://hardforum.com/showthread.php?t=928593 - according to which, if you cannot play the game at 60 FPS without vsync, you have no reason to enable vsync. (But feel free to try; remember that triple buffering must be enabled by your GPU though.)
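The linked thread's argument can be shown with a little arithmetic. With plain double-buffered vsync, a frame that misses a refresh has to wait for the next one, so the displayed rate snaps down to whole divisions of the refresh rate; triple buffering is what avoids that snapping. This is a simplified model (real frame pacing is messier), and the numbers are illustrative:

```python
import math

# Displayed FPS under double-buffered vsync on a monitor with the given
# refresh rate: each frame is shown on the next refresh it can make, so
# rates below the refresh snap to refresh/2, refresh/3, and so on.

def vsync_fps(render_fps, refresh=60):
    if render_fps >= refresh:
        return refresh
    return refresh / math.ceil(refresh / render_fps)

print(vsync_fps(75))  # 60: capped at the refresh rate
print(vsync_fps(50))  # 30.0: snaps down, costing far more than the missing 10 FPS
print(vsync_fps(25))  # 20.0: snaps to the next whole division
```

This snapping is exactly why the advice above says vsync only makes sense when you can already hold the full refresh rate without it.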

 

-- Texture Memory:

Texture Memory increases the visual quality of the textures (http://en.wikipedia.org/wiki/Texture_mapping) in your game. Games usually stream textures into your graphics card's RAM (VRAM/GDDR on dedicated cards; integrated graphics use your computer's RAM/DDR). The higher you set the texture memory, the more memory it consumes and the better it looks. Most graphics cards today come with 1-2 GB of GDDR, which is plenty for this game's high texture memory setting. If you've got a card with 512 MB or less, it's recommended to set it down to medium; if you've got integrated graphics, set it to low (to save memory).
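To put rough numbers on the 512 MB advice, here's a back-of-the-envelope estimate of what a single texture costs in video memory. It assumes generic uncompressed RGBA textures with a full mipmap chain; Warframe's actual formats and sizes aren't documented here, so every number is an assumption:

```python
# VRAM footprint of one uncompressed RGBA texture. A full mipmap chain adds
# roughly one third on top of the base level. Real games use compressed
# formats (e.g. DXT), which shrink these figures considerably.

MB = 1024 * 1024

def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

print(round(texture_bytes(2048, 2048) / MB, 1))  # 21.3 MB for one 2048x2048 texture
print(round(texture_bytes(1024, 1024) / MB, 1))  # 5.3 MB at half the resolution per side
```

Even at a quarter of the memory per texture, a few hundred textures add up fast, which is why halving texture resolution is such an effective lever on small-VRAM and shared-memory cards.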

 

-- Shadow Quality:

Through some method, it obviously increases the quality of your shadows. I don't know how and I haven't tested it, but if you have a description for it, post it and I'll add it here.

Increasing Shadow Quality is a known resource hog for your GPU. Having it on high will most likely carry a very heavy performance hit, and if you have a bad GPU it's recommended to turn shadows off completely in some games. A good example is Skyrim: run it with shadows on low and even the worst GPUs can handle it with all other settings maxed out; set everything to low but shadows to high and you'll lag to pieces if your GPU isn't up for it.

 

----------------------------------------------

 

Thanks for reading. I hope you learned something, and more importantly I hope all this effort helped someone :)

Edited by Rabcor

Recommended Posts

So basically it can use 2 threads but avoids it. I wonder why it would do that...

Good news is that the upcoming PS4 is going to have an 8-core processor, meaning this unicore habit will probably start to change this year. I don't know the specs of the Steam Box or the upcoming Xbox, but seeing as they're both to be released this year, I'm certain each will have at least a quad-core CPU.

The PS4 has an AMD APU from what I recall, but it will be x86 (the same CPU architecture used in our "PCs"). So, yes, theoretically future games should be easier to port to PC.

The Xbox Infinity (rumored name) is also expected to use an AMD APU.

 

I decided to watch my CPU graphs while playing Warframe, and fixed affinity to not use CPUs 0~3. The CPU thread it did use only saw a max load of 95%. That means my total CPU usage never really goes over 26%. The game as a whole is lacking in efficient code, and I see this other Core i5-3550K machine in the house run the game mostly flawlessly. That CPU may be overall slower than my i7-3930K in the most demanding multithreaded loads, but it packs a higher frequency and clock efficiency, so it appears to handle this game better.


The PS4 has an AMD APU from what I recall, but it will be x86 (the same CPU architecture used in our "PCs"). So, yes, theoretically future games should be easier to port to PC.

The Xbox Infinity (rumored name) is also expected to use an AMD APU.

 

I decided to watch my CPU graphs while playing Warframe, and fixed affinity to not use CPUs 0~3. The CPU thread it did use only saw a max load of 95%. That means my total CPU usage never really goes over 26%. The game as a whole is lacking in efficient code, and I see this other Core i5-3550K machine in the house run the game mostly flawlessly. That CPU may be overall slower than my i7-3930K in the most demanding multithreaded loads, but it packs a higher frequency and clock efficiency, so it appears to handle this game better.

 

 

What are you clocked at? You should be able to hit 4 GHz easy on that if you aren't already, just like my FX-8120.


I run it stock. I originally had it clocked at 4.5GHz for the first few months, but that was occasionally unstable. For the most part, very few applications showed a significant improvement going from 3.2GHz to that much higher.

 

For the most part, stock Turbo Boost is driving the chip to 3.5GHz automatically anyway.


I also always run stock. I don't see the point in overclocking processors when they're already this powerful at stock; I don't really need the extra GHz, so I usually stick to buying non-overclockable Intel processors (name doesn't end with a K) since I know I'm not going to overclock anyway. All I see overclocking do is shorten the lifespan of my hardware. I'd rather it just live long the way it is, and maybe when it's old, if I don't have something new, I'd consider overclocking it. (But I also know I'll have something new before this gets old anyway.)


I upgrade every 2 years; OCing isn't going to shorten the lifespan enough to matter, and really, as long as you stay within voltage tolerance it's fine. The advantage of OC is increased per-core performance. Single-threaded applications usually see the biggest difference from an OC in my experience, because even though multithreaded apps get the same boost, it's less noticeable since they're already running faster. It's like trying to tell the difference between 1000 and 1010 mph visually, lol.

Edited by Ecotox

I don't have a specific time to upgrade, you see; I just upgrade when I start to feel that I need to.

 

Even though it's about a year old (apart from the GPU), my rig still seems good enough for another 2 years at least. Technology has been advancing pretty slowly over the last few years in the computer department, at least compared to the early 2000s. Products don't seem to be advancing much to me; rather, they mostly seem to be getting more expensive components, making them a lot more expensive to buy. At least that's my perspective.


(I usually don't double post, but... the thread was dying, and I wouldn't want that.)

I was helping my brother pick out the best parts for the fairest pricing today, and I discovered that my setup is still better than that, even though it's a year old. Technology really isn't advancing much. His computer also cost just about as much as mine did: no price advancements and no better GPUs, RAM or motherboards. The CPU was just a bit better, although my old one is still better overall (his is a recent i5, mine an older i7).

Edited by rabcor

You'll only shorten the lifespan if you go crazy with the voltage. If you get a good chip you can push a good overclock on stock voltages; some really good chips actually manage on less! Also, with the K CPUs you can just overclock the turbo rather than the base clock, so there's less strain on the CPU when idle. I personally bumped my turbo up to 4.2GHz on stock voltage.


Better portable devices... screw that. Although I'm considering whether or not I should really get an Android phone. The Android OS is pretty cool after all, and there are some awesome things that can be done on it. I could even write my own apps for it, which I can't for my current phone... hmm...


(I usually don't double post, but... the thread was dying, and I wouldn't want that.)

I was helping my brother pick out the best parts for the fairest pricing today, and I discovered that my setup is still better than that, even though it's a year old. Technology really isn't advancing much. His computer also cost just about as much as mine did: no price advancements and no better GPUs, RAM or motherboards. The CPU was just a bit better, although my old one is still better overall (his is a recent i5, mine an older i7).

 

Tech gets faster every 2 years; a computer from a year ago is pretty much a computer now, and a computer next year will be faster. Tech goes in cycles: you get the new/faster stuff every 2 years, and a year after a release comes a shrink and improvement on that general design, not faster but more efficient. My old computer is definitely slower than my current one, which I bought almost 2 years ago. Wait till the new Intel CPUs come out with new features, better IPC, etc.; it may not be "omg I have to upgrade", but it will be faster. I'm personally waiting for the next in the FX and Radeon lines, supposedly coming late this year/early next year, when my upgrade time rolls around. I base my upgrade times on Moore's law. Yes, there will be a point where current tech stagnates to a degree, but we won't even begin to reach that till 2016 at the earliest. I'll just put my upgrade path over the past 6 years here as an example of why I wait that long:

 

1. Athlon64x2 5400

2gb DDR2 800

Geforce 8800gts 512mb

 

2. Phenom II x4 920

4gb DDR2 1066

Radeon HD 4870 1gb

 

3. FX 8120

8gb DDR3 1600

Radeon HD 6970 2gb

 

All those upgrades gave me a fairly significant performance boost; if I had upgraded sooner, the same could not be said :> 2 years is the minimum for an upgrade, otherwise just upgrade when you feel you need to :D

Edited by Ecotox

The thing is, the computer industry is putting more R&D into making better portable devices, not better desktops, since that's where the money is right now.

Yes, in the money aspect, but AMD and Intel are still developing the desktop space much as they always have. The change just seems more drastic in the mobile space because it's playing catch-up on 15-20 years. The desktop space hasn't slowed; it just doesn't seem as drastic from a user standpoint. Also keep in mind that even today a lot of software doesn't take advantage of current tech; that doesn't mean the tech is slowing pace, it means the software designers need to get with the times c:

Edited by Ecotox

When the host lags, the clients lag?

What? What gives?

Picture this.

 

You're playing a game on your computer. This means your computer is controlling everything that happens in the game. Most notably, it controls how the AI (your opponents) move and what they do; it calculates their decisions for you.

 

Understood so far? Good.

 

Now 3 players connect to your computer to join your game. Depending on internet bandwidth and their distance from you, they'll have some latency/ping; for example, 200 ms ping means everything happens 200 milliseconds later on their computer than on yours (the host's). If the players are too far from the host's connection, or if the host's bandwidth is too occupied (uploading torrents being the most likely cause), the other players' latency can increase to the point where everything happens seconds later on their machines, and mobs and players may appear to jump around on their screens. (This also works the other way around: their latency to you, the host, is how long it takes your computer to detect what they did, and then it takes the other players' latency again to send them the news that something happened.) There are other things that can mess this bandwidth situation up (such as the infamous Strict NAT problem), but let's not go into that.
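The arithmetic above can be sketched like this. Note I'm using "ping" the way this post does, as a one-way delay, and all the millisecond values are made up:

```python
# When the host is authoritative, a host-side event reaches a client one
# one-way delay later, and one client's action needs two one-way hops
# (client -> host, host -> other client) before anyone else sees it.

def seen_at(host_time_ms, one_way_ms):
    """When a host-side event becomes visible on a client."""
    return host_time_ms + one_way_ms

def relay_delay(actor_one_way_ms, viewer_one_way_ms):
    """Client A acts -> host registers it -> client B sees the result."""
    return actor_one_way_ms + viewer_one_way_ms

print(seen_at(0, 200))        # 200: a host event at t=0 shows up 200 ms later
print(relay_delay(200, 150))  # 350: A's action takes 350 ms to reach B
```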

 

Understood so far? Good.

 

Now you should be able to picture why everyone lags when the host lags. The host usually lags because his computer can't handle everything that's going on: it may be struggling to keep up with all the other players' actions, or there may simply be too many mobs for it to control, making it slower to do the math on what the mobs are supposed to be doing. Since every other player relies on the host to tell them what's going on, both among players and mobs, host-side lag means the other players will see their allies' moves much later than they were performed, and the mobs will tend to freeze up or be unresponsive (for example, a Grineer shoots at you, and you can run behind it because it takes too long to turn). All sorts of unpredictable problems can show up when the host's computer lags. This is why the host must have good hardware and a good internet connection (basically, always make the best computer with the best internet in your group of friends the host).

 

As you can see, host lag is not so different from connection lag, but I think host computer lag tends to be more forgiving than connection lag. (Mostly because the host can't lag as badly as the connection can.)

Edited by rabcor

This thread will certainly be quite helpful to a lot of people, thanks for making it ;)

I had a question about ambient occlusion... I couldn't find anywhere on the internet whether the Evolution Engine DE uses for Warframe uses or supports ambient occlusion.

I could try to force it through the drivers with Nvidia Inspector, but I felt it would be better to ask on the forums first.

Your post seemed like the right place to ask ^^


This thread will certainly be quite helpful to a lot of people, thanks for making it ;)

I had a question about ambient occlusion... I couldn't find anywhere on the internet whether the Evolution Engine DE uses for Warframe uses or supports ambient occlusion.

I could try to force it through the drivers with Nvidia Inspector, but I felt it would be better to ask on the forums first.

Your post seemed like the right place to ask ^^

I don't know for certain, but there's no harm in testing it out. Give it a go and see; if you don't notice a difference, turn it off c: tbh it may, but the engine needs a lot of work, and even some things it's supposed to support don't work right, i.e. DX11 doesn't do much but ruin the lighting right now, adding flickering and making the light overly bright (at least that's all I see).


 

 

Now you should be able to picture why everyone lags when the host lags. The host usually lags because his computer can't handle everything that's going on: it may be struggling to keep up with all the other players' actions, or there may simply be too many mobs for it to control, making it slower to do the math on what the mobs are supposed to be doing. Since every other player relies on the host to tell them what's going on, both among players and mobs, host-side lag means the other players will see their allies' moves much later than they were performed, and the mobs will tend to freeze up or be unresponsive (for example, a Grineer shoots at you, and you can run behind it because it takes too long to turn). All sorts of unpredictable problems can show up when the host's computer lags. This is why the host must have good hardware and a good internet connection (basically, always make the best computer with the best internet in your group of friends the host).

 

In most situations, yes, you're right, but I've noticed that when I'm host with 4 players and around 40 mobs in front of me, my FPS drops as low as 12-20, while as a client I've never seen it go under 35, which is a huge difference. I believe it depends on how the game is coded, though: while I was hosting with very low FPS I've asked plenty of squads whether their game was smooth (connection-wise), and they said everything was fine; these were, of course, people from Europe who weren't too far from me. I believe the game is coded to reserve a certain core, or a percentage of the cores, for the connected players and leave the rest for my own game.

 

In older games like CS, if you host a server and players connect to you, they run smoothly, but once you start playing, your client gets higher priority and the server only eats the "leftovers", which is why the players have a hard time. IMO Warframe is doing exactly the opposite.

 

Most probably GPU performance will improve and they'll fix the bugs; as for the CPU, I'm not expecting too much. The game currently has support for 4 cores, but in terms of raw "power" it effectively uses only 2 (e.g. two cores at 60% and the other two at 40%). Making the game use all the cores fully would mean rewriting a significant part of the game code, which I highly doubt will happen.
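That kind of partial core usage is essentially Amdahl's law in action: if only a fraction of the per-frame work runs in parallel, extra cores give diminishing returns. A quick back-of-the-envelope, where the 60% parallel fraction is an illustrative assumption rather than a measured figure for Warframe:

```python
# Amdahl's law: overall speedup from n cores when only a fraction p of the
# per-frame work can run in parallel. p = 0.6 is an illustrative guess,
# not a measured number for Warframe.

def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for n in (1, 2, 4, 8):
    print(n, round(speedup(0.6, n), 2))
# 1 core: 1.0x, 2 cores: ~1.43x, 4 cores: ~1.82x, 8 cores: ~2.11x
```

Even with unlimited cores the speedup caps out at 1/(1-p) (2.5x here), which is why rewriting the serial parts of an engine is the expensive but necessary step for real multi-core scaling.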


I am waiting for the new consoles to arrive before upgrading my PC, or maybe I'll get one of those consoles too.

 

What I'm most eagerly awaiting are the APUs that are now being developed; let's see what they bring to our PC gaming community. Having the same or even better hardware, with the same architecture the consoles use, makes porting a lot easier, which should mean better-quality ports.



Everyone's FPS will not drop when the host lags; the stuff in their game (mobs and other players) will simply appear to lag. Their FPS depends solely on the game's code and their own computer's specs. Borrowing spare processing power from the connected clients is a good idea, though, and might be able to prevent host-side lag, but so far I haven't seen the game do that.

 

I also doubt they're going to rewrite significant parts of their engine. But look at how successful this game is, and how good the engine is even without proper multithreading; they just might be considering it. It's a pretty risky thing to do, though, and it'll take time. If I were them I'd seriously consider it and come to one of two conclusions: either do it, or build a new engine for the next game and do everything short of rewriting huge sections of the engine for this one.

 


 


 

I don't know if the game supports Ambient Occlusion; I have a feeling they'd include it in the options if it did. But you can try and see if it does anything. If you see any real effect, I'll add a line about it to the main post.


Nooo, don't die, thread :( Speaking of which, mods, have you considered pinning this thread? Too useless to be pinned, maybe? I'll try to make this the last time I intentionally bump it (whether you pin it or not).

 

--

 

My laptop has an AMD APU and I personally hate it, along with the switchable graphics it makes possible (which opens tons of new opportunities for overheating). I don't have high hopes for APUs at all.

 

What I do have high hopes for are SSDs and the possibilities they bring; I look forward to their prices coming down. I'm waiting for next-gen RAM as well, since DDR3 is getting older every day.

 

Look at the PS4 specs (http://en.wikipedia.org/wiki/PlayStation_4):

 

An 8-core processor (even if it's AMD) and 8 GB of GDDR5 RAM.

 

 

What I'm skeptical about is the AMD Radeon GPU integrated into the APU. That doesn't look promising to me in any way.



They're actually very promising in many ways. Things are starting to move toward GPGPU computing, and integrating the GPU on-die helps ease that transition. It allows faster communication between GPU and CPU and isn't really any more detrimental heat-wise. It also reduces motherboard complexity.

AMD's next-gen APUs have many improvements to CPU-GPU communication, allowing them to talk directly and share data, which can drastically improve tasks that utilize both. It also lets both access memory equally, without having to dedicate a static section of it to the GPU. Depending on the APU, the GPU included on-die can be fairly beefy too; some get into the range of upper Radeon 4000 series performance, which is nothing to turn your nose up at c:

I don't really see a huge breakthrough for APUs in gaming in the coming years. They'll probably be great price/performance-wise for entry-level gaming, but that's about it. It'll be a long while before APUs come even close to high-end discrete GPUs in performance, and it's not like discrete cards will stop advancing any time soon. Something I'd also be interested to see is APUs being fully utilized alongside discrete graphics hardware.

 

But having simpler, more affordable options for PC gaming can only be a good thing for everyone. Regardless, I'll be keeping an eye on where the tech goes in the near future.


Interesting. But just like Inglu said, those APUs have nothing (at least not yet) on the huge dedicated cards we use in our desktops. Size matters.

 

Actually, I've been wondering for years why they don't make bigger CPUs. It wouldn't be that hard to make motherboards with 4x the socket size and a processor to match, which would probably end up both cheaper to make and better; you can fit more components into a bigger package, after all, and less of the research would go into cramming more stuff into a smaller container.

 

The graphics card companies don't seem to hold back much on size (as we can all see), but Intel and AMD try to make things as small as possible.

 

Now that there's such a huge distinction between mobile devices and desktops, why not let desktops make some use of their size? It's going to fit in the box.



GPUs are about the same size as CPUs; the majority of a graphics card's bulk is for heat dissipation, i.e. heatsinks. (I've done custom heatsinks; there's not much under the hood physically on a graphics card.)

Bigger is not better in the computing world. Smaller means less distance for signals to travel, so lower voltages achieve the same or better results, less material is used, less heat is produced, and more can be packed into a given die area. Bigger is generally worse when it comes to computers. The same goes for servers: the smaller the clusters and the less heat they make, the more you can pack in, the more power you get, and the less energy they use, saving money and reducing the cost of service. Making chips bigger would not be a good idea. They used to be fairly huge by comparison back in the 90s; my AMD K5 was a monster compared to my FX-8120.
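The "smaller means shorter signal paths" point is easy to put rough numbers on: at modern clock speeds a signal only covers a few centimetres per cycle, so quadrupling a chip's footprint pushes wire delays toward a full clock cycle. The propagation speed below is an approximate, illustrative figure:

```python
# Rough arithmetic behind "smaller = less distance for signals to travel".
# Signals propagate at very roughly half the speed of light in typical
# on-chip/on-board interconnects (an approximate, illustrative value).

SIGNAL_SPEED_M_S = 1.5e8   # ~0.5c, assumed propagation speed

def distance_per_cycle_cm(clock_ghz):
    """How far a signal can travel in one clock cycle, in centimetres."""
    period_s = 1.0 / (clock_ghz * 1e9)
    return SIGNAL_SPEED_M_S * period_s * 100.0

print(round(distance_per_cycle_cm(4.0), 2))   # ~3.75 cm per cycle at 4 GHz
```

So a die several centimetres across would spend a meaningful fraction of every cycle just waiting for signals to cross it, which is one reason shrinking, not growing, is the path to higher clocks.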

Also, APUs aren't designed to compete with high-end systems; they're meant to bring mid-range performance to the masses, which they do fairly well, and to encourage software developers to start taking advantage of newer tech.

