
RTX... will it arrive to Warframe?


GeekHotfix

Recommended Posts

Dude, honestly RTX is garbage. In fact, every single post-processing effect made by Nvidia is a resource-hungry gimmick; they implement those in AAA games as an excuse for people to buy high-end hardware.

You can perfectly play any game at max graphics with a GTX 1050 Ti, or even a GTX 1070 (if you want to use your television instead of a monitor). Any hardware more powerful than that is for unnoticeable gimmicks like Nvidia HairWorks, RTX, or PhysX.

Same goes for 16-18 core processors: you never get to use the full potential of those unless you are running a server.

Edited by Toppien

10 hours ago, Tenno1978 said:

This may have been covered before, but is there any plan on implementing RTX technology for Warframe?

RTX specifically is a no. Steve stated that DE have moved away from tech that is locked to a specific vendor, or in this case, locked to a specific series from a specific vendor.

Ray tracing isn't off the table, but it's probably a long way off, at least until all vendors support ray tracing of some sort. For example, DE use tessellation extensively in the Vallis, and at one point that tech was in a similar position, requiring specific hardware.

 


16 hours ago, Toppien said:

You can perfectly play any game at max graphics with a GTX 1050 Ti, or even a GTX 1070 (if you want to use your television instead of a monitor). Any hardware more powerful than that is for unnoticeable gimmicks like Nvidia HairWorks, RTX, or PhysX.

This is just wrong.

A GTX 1050 Ti is a really good budget card, but it cannot "play any game with max graphics". The GTX 1070 should be able to play most games at max graphics, but the 1050 Ti only has 4GB of VRAM, so there are a few games out there where it can't load the textures and can't run at max settings. The 1070 is the first card in the Nvidia line with 8GB of VRAM across the board (the GTX 1060 has a 6GB version that should also do fine for most games at max graphics) and should have enough memory to load the textures onto the card. The GTX 1080 Ti and RTX 2080 Ti both have 11GB, but for 99% of uses, 8GB should be enough to run a game at max graphics settings.


17 hours ago, peterc3 said:

Why would they implement ray tracing when the only hardware available is from a single vendor and the performance hit is still astronomical?

It's gotten better. This could be damning with faint praise, but the newest entry on the RTX market, the RTX 2060, is actually a very good performer and doesn't have a horribly inflated cost of admission.

The main caveat is that it's aimed at 1080p gaming, and although it does pretty well at 1440p, it's totally out of its depth at 4K.

This raises hopes for a much more realistically priced 21xx series of RTX cards. Unfortunately, it's going to need that, or the tech is going to wither on the vine, since no one will implement ray tracing in games when practically no one owns the cards that enable it. We still only have one game and some promises.

 


16 hours ago, Toppien said:

Same goes for 16-18 core processors: you never get to use the full potential of those unless you are running a server.

This is also not true.

There are video editing and rendering programs out there that can and will take advantage of multicore processing.
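
Just to illustrate that point (a minimal sketch in C++, not taken from any actual editing or rendering program): the reason those workloads scale with core count is that the frames are independent, so each core can chew through its own slice.

// Minimal sketch: split independent per-frame work across all hardware threads.
#include <algorithm>
#include <cstddef>
#include <iostream>
#include <thread>
#include <vector>

// Stand-in for real per-frame work (encoding, filtering, rendering, ...).
double process_frame(std::size_t frame) {
    double acc = 0.0;
    for (int i = 0; i < 100000; ++i)
        acc += static_cast<double>((frame + i) % 97);
    return acc;
}

int main() {
    const std::size_t frame_count = 2000;
    std::vector<double> results(frame_count);

    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (frame_count + cores - 1) / cores;

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < cores; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = std::min(frame_count, begin + chunk);
        if (begin >= end) break;
        // Each thread fills its own slice of the results, so no locking is needed.
        workers.emplace_back([&results, begin, end] {
            for (std::size_t f = begin; f < end; ++f)
                results[f] = process_frame(f);
        });
    }
    for (auto& w : workers) w.join();

    std::cout << "Processed " << frame_count << " frames on "
              << cores << " hardware threads\n";
}

A game's main loop is much harder to split up like that, which is why games rarely use all 16-18 cores, but offline rendering and encoding jobs absolutely do.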


17 hours ago, GnarlsDarkley said:

I don't think so.

They even developed their own particle effects so they no longer rely on Nvidia.

RTX is about realistic lighting/shadows/reflections, not particles or physics. And actually RTX (or, more accurately, DXR) is integrated into DX12; it's just that only the Nvidia 20xx series supports it with dedicated hardware for a performance boost (technically it can be used on any GPU).
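
To make the "integrated into DX12" part concrete, here's a rough C++ sketch (purely illustrative, nothing to do with Warframe's engine) of how an application asks Direct3D 12 whether the installed GPU/driver exposes DXR at all. The query is plain DX12 and vendor-neutral; what the RTX cards add is the dedicated hardware that makes the reported tier fast.

// Sketch: check whether the default adapter reports a DXR (raytracing) tier.
// Build with the Windows 10 SDK and link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device5> device;
    // Create a device on the default adapter; DXR itself needs ID3D12Device5.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No suitable D3D12 device available.");
        return 0;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))) {
        if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
            std::puts("DXR supported: driver reports raytracing tier 1.0 or better.");
        else
            std::puts("DXR not supported on this adapter/driver.");
    }
    return 0;
}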


There are two things that will decide if it becomes common, IMHO. First is the cost. The sad part is that the cost is still above what mainstream gamers pay. The new 2060 is at $350 (good luck with that; it's highly unlikely any will be at that price until fall or winter, the gouging will be strong). That MSRP is in the same price range as a 1070, and most gamers spend less than $250 on a GPU. Second is that they make it so it isn't totally proprietary. Other vendors either have to have access to it, or their own version that works the same way, to keep it simple for programmers. AMD is supposed to have a similar system in the works, if not already developed. If the Nvidia version and that one are functionally more or less the same, and developers can handle both with minimal issues, I think it will happen. Making something proprietary and cutting off a segment of your market is a bad idea; that sort of thing is for consoles.


RTX technology is a million light years away from what Warframe actually is. Do you really think it would add any value to the game in terms of experience? The devs, even doing a good job, still sent us into the winter holidays with an orb heist that killed the framerate of even high-end computers due to optimisation issues/bugs.

This is typically the kind of thing that eats up time for no result in the end, especially for an option only available if you spend 800 bucks on a new GPU. What's the actual point of doing that from a dev team's perspective?

From my point of view, I can get a constant 144 FPS with my current 970 SLI. If I had to replace it, what would be the point of going for a 2050? If I wanted a real performance jump I would have to go for a 2070, which is not an investment I find useful just to get that RTX option, an option that would end up disabled anyway, because gamers usually aim for the best FPS/performance/visuals ratio.

Usually, turning all those kinds of options on just kills the experience due to lack of optimisation. I'd rather have a constant 144 FPS matching my monitor's refresh rate than wonderful useless effects at an inconsistent 60 FPS.

Not everybody likes to "waste" money on gimmicks that offer nothing useful. In my case money isn't an issue; I just don't like feeling robbed buying the latest super-hyped smartphone simply because it's supposed to have a better camera... same goes for computer hardware. If I find a technology interesting, I go for it without looking at the price. But honestly, I find those effects more of an annoyance than a real evolution in gaming.

Ultra-realism is good for movies. In a multiplayer game you end up switching those kinds of effects off (like most people do with film grain, bloom, and lights blinding you all the time) because they just strain your eyes and make you tired faster.

Last, seeing the average configurations people have when I join random parties, I highly doubt they have hardware good enough to turn even 75% of the options on, so RTX... well... RIP.

Warframe has never been in the "use the latest super effect available only with the latest 1000-buck GPU" race, and I hardly think that will change anytime soon.

One more time, just a personal opinion, but when a game has to advertise its quality only because it's using a technology that nobody else is using yet, it sounds in my head like "hey, our game is bad in terms of gameplay, but look, as compensation, we have nice-looking effects". You know, those marketing tricks to prove the game is good even if it's totally boring and lacking content.

Edited by N2h2

1 hour ago, neo3587s said:

RTX is about realistic lighting/shadows/reflections, not particles or physics. And actually RTX (or, more accurately, DXR) is integrated into DX12; it's just that only the Nvidia 20xx series supports it with dedicated hardware for a performance boost (technically it can be used on any GPU).

The point wasn't the particles, the point was not relying on Nvidia.


4 hours ago, DarkKnight271 said:

This is just wrong.

A GTX 1050 Ti is a really good budget card, but it cannot "play any game with max graphics". The GTX 1070 should be able to play most games at max graphics, but the 1050 Ti only has 4GB of VRAM, so there are a few games out there where it can't load the textures and can't run at max settings. The 1070 is the first card in the Nvidia line with 8GB of VRAM across the board (the GTX 1060 has a 6GB version that should also do fine for most games at max graphics) and should have enough memory to load the textures onto the card. The GTX 1080 Ti and RTX 2080 Ti both have 11GB, but for 99% of uses, 8GB should be enough to run a game at max graphics settings.

The question is, do you really need that much memory for textures? Because I don't, and even if the game wants it, you really can't tell the difference between a game that uses 3.5-4 GB of VRAM and a game that uses more.

Let's take Titanfall for example: the game is texture heavy, sure, but with 4 GB you really don't notice any lack of quality in the textures.


On 2019-01-08 at 2:56 PM, Toppien said:

The question is, do you really need that much memory for textures? Because I don't, and even if the game wants it, you really can't tell the difference between a game that uses 3.5-4 GB of VRAM and a game that uses more.

Let's take Titanfall for example: the game is texture heavy, sure, but with 4 GB you really don't notice any lack of quality in the textures.

The answer in some cases is YES.

Let's take Fallout 4 for instance. If you don't have a graphics card with 8GB of VRAM you can't run the game at max graphics, and yes, that's at 1080p. It literally won't play because the game can't load the assets into VRAM; I think the best you can get is medium-level graphics. There are other games where this is an issue as well. And while some people may not be able to tell the difference between medium and ultra graphics, I think most people can.

So if you're looking to future-proof your purchase and make sure you're going to be able to run games for a longer time, get a graphics card with at least 6GB of VRAM, or if you can afford it, 8GB. If you don't mind running games at medium and lower graphics settings, then a card with 4GB of VRAM will be fine.

So, to restate it, a GTX 1050 Ti is a good budget card, but it can't "play any game with max graphics" like you so incorrectly claim.
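
And if anyone wants to see what their own card actually reports before arguing about texture budgets, a quick DXGI query lists each adapter's dedicated VRAM (just a C++ sketch, nothing game-specific; link against dxgi.lib):

// Sketch: enumerate GPU adapters and print their dedicated video memory.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        // DedicatedVideoMemory is reported in bytes.
        std::printf("%ls: %.1f GB dedicated VRAM\n",
                    desc.Description,
                    desc.DedicatedVideoMemory / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}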


4 hours ago, DarkKnight271 said:

The answer in some cases is YES.

Let's take Fallout 4 for instance. If you don't have a graphics card with 8GB of VRAM you can't run the game at max graphics, and yes, that's at 1080p. It literally won't play because the game can't load the assets into VRAM; I think the best you can get is medium-level graphics. There are other games where this is an issue as well. And while some people may not be able to tell the difference between medium and ultra graphics, I think most people can.

So if you're looking to future-proof your purchase and make sure you're going to be able to run games for a longer time, get a graphics card with at least 6GB of VRAM, or if you can afford it, 8GB. If you don't mind running games at medium and lower graphics settings, then a card with 4GB of VRAM will be fine.

So, to restate it, a GTX 1050 Ti is a good budget card, but it can't "play any game with max graphics" like you so incorrectly claim.

I played Arkham Knight at full graphics with a 760 at 50 fps; the 1050 Ti is better.

I can play The Witcher 3 at max graphics at 50 FPS with the 1050 Ti. I overclocked my processor a little (it's a Core i5, 3rd or 4th generation, I don't remember XD) and now it runs at 60 fps even in the big city (Novigrad, I think it's called), and that is the real stress test for that game.

I can play Titanfall 2 at max graphics within the 4 gigs of VRAM budget, at 60 fps, with no visible lack of quality in the textures (unless you really want to sniff those walls XD).

In SC2 I can play the co-op maps at 60 fps, dropping to 45-40 in really heavy fights.

I could say I play Warframe and Doom at max graphics too, but those two are really well optimized, so they don't count.

Don't give me the BS that Fallout 4 needs a powerful graphics card with a lot of VRAM, because Bethesda games have had S#&$ engines with bad optimization since Skyrim. A DLC with really high-resolution textures? Yeah, sure, but their "high quality texture" DLC is S#&$ compared to what any modder does at the same resolutions while using less VRAM.

And if you want your hardware to last longer, overclocking is the solution XD. Just make sure you have a good enough cooling system and don't overvolt too much. I made my 650 Ti last long enough to play at, well, not max graphics, but medium-high graphics, until 2016, when I had to buy the 760. But my 760 was just a really poorly designed card, so it burned out, and then I bought my 1050 Ti and I've never been happier 😄

Edited by Toppien

3 hours ago, Toppien said:

I played Arkham Knight at full graphics with a 760 at 50 fps; the 1050 Ti is better.

I can play The Witcher 3 at max graphics at 50 FPS with the 1050 Ti. I overclocked my processor a little (it's a Core i5, 3rd or 4th generation, I don't remember XD) and now it runs at 60 fps even in the big city (Novigrad, I think it's called), and that is the real stress test for that game.

I can play Titanfall 2 at max graphics within the 4 gigs of VRAM budget, at 60 fps, with no visible lack of quality in the textures (unless you really want to sniff those walls XD).

In SC2 I can play the co-op maps at 60 fps, dropping to 45-40 in really heavy fights.

I could say I play Warframe and Doom at max graphics too, but those two are really well optimized, so they don't count.

Don't give me the BS that Fallout 4 needs a powerful graphics card with a lot of VRAM, because Bethesda games have had S#&$ engines with bad optimization since Skyrim. A DLC with really high-resolution textures? Yeah, sure, but their "high quality texture" DLC is S#&$ compared to what any modder does at the same resolutions while using less VRAM.

And if you want your hardware to last longer, overclocking is the solution XD. Just make sure you have a good enough cooling system and don't overvolt too much. I made my 650 Ti last long enough to play at, well, not max graphics, but medium-high graphics, until 2016, when I had to buy the 760. But my 760 was just a really poorly designed card, so it burned out, and then I bought my 1050 Ti and I've never been happier 😄

Lol max graphics and 50fps in the same sentence.

Most budget graphics cards can handle max settings in games, but with terrible frame rates.

A decent framerate is 60+, a good one is 100+. You can't call it max graphics without taking into account the framerate it runs at.

Also, I consider max graphics to mean at least QHD; FHD is an age-old resolution.

And a 1050 Ti can't really handle QHD in most games. It's a glorified GTX 960, and it isn't even close to the previous-gen GTX 970, which I would consider a pretty fine GPU for playing at FHD and, for some very light games, even at QHD.

 

BTW, I don't want to ruin your happiness with the 1050 Ti. It's a cool budget card and I am sure you enjoy FHD content on it, and you should.

Just pointing out that there's a world out there that considers 50-60 fps at FHD very, very far from max graphics.

Edited by JohnKable

As of now RTX is little more than a fancy but impractical tech trick that is way too resource-intensive in proportion to the effect it has on graphical fidelity.


DX12 and Vulkan have ray tracing interfaces which can be used for that. Whenever it comes, ray tracing will come through a vendor-neutral interface.

And it's not a priority now, as it would be an extra coat of shininess with a super heavy performance hit, usable only on a small portion of super high-end systems.

DX12 has much more than ray tracing, though, and would benefit the performance of the game.

I think DX12 should come in 1-2 years for the performance advantages, with ray tracing-specific graphical improvements afterwards.
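
On the Vulkan side, "vendor-neutral interface" really just means a device extension you can query for. Here's a minimal C++ sketch (assuming the Vulkan SDK and loader are installed; at the time of this thread the only shipping extension was Nvidia's VK_NV_ray_tracing, with the cross-vendor VK_KHR_ray_tracing_pipeline arriving later):

// Sketch: check each Vulkan physical device for ray tracing extensions.
// Link against the Vulkan loader (vulkan-1).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

static bool has_extension(const std::vector<VkExtensionProperties>& exts, const char* name) {
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, name) == 0) return true;
    return false;
}

int main() {
    VkApplicationInfo app = {VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;
    VkInstanceCreateInfo ici = {VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ici.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpuCount);
    vkEnumeratePhysicalDevices(instance, &gpuCount, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &extCount, exts.data());

        const bool nv = has_extension(exts, "VK_NV_ray_tracing");            // vendor extension
        const bool khr = has_extension(exts, "VK_KHR_ray_tracing_pipeline"); // cross-vendor extension
        std::printf("%s: NV ray tracing %s, KHR ray tracing %s\n", props.deviceName,
                    nv ? "yes" : "no", khr ? "yes" : "no");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}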


20 hours ago, Toppien said:

I played Arkham Knight at full graphics with a 760 at 50 fps; the 1050 Ti is better.

I can play The Witcher 3 at max graphics at 50 FPS with the 1050 Ti. I overclocked my processor a little (it's a Core i5, 3rd or 4th generation, I don't remember XD) and now it runs at 60 fps even in the big city (Novigrad, I think it's called), and that is the real stress test for that game.

I can play Titanfall 2 at max graphics within the 4 gigs of VRAM budget, at 60 fps, with no visible lack of quality in the textures (unless you really want to sniff those walls XD).

In SC2 I can play the co-op maps at 60 fps, dropping to 45-40 in really heavy fights.

I could say I play Warframe and Doom at max graphics too, but those two are really well optimized, so they don't count.

Don't give me the BS that Fallout 4 needs a powerful graphics card with a lot of VRAM, because Bethesda games have had S#&$ engines with bad optimization since Skyrim. A DLC with really high-resolution textures? Yeah, sure, but their "high quality texture" DLC is S#&$ compared to what any modder does at the same resolutions while using less VRAM.

And if you want your hardware to last longer, overclocking is the solution XD. Just make sure you have a good enough cooling system and don't overvolt too much. I made my 650 Ti last long enough to play at, well, not max graphics, but medium-high graphics, until 2016, when I had to buy the 760. But my 760 was just a really poorly designed card, so it burned out, and then I bought my 1050 Ti and I've never been happier 😄

Dude, all the games you mentioned are at least 3 or 4 years old. StarCraft 2 is 8, going on 9, years old. I would hope a 2-year-old graphics card, even a budget card, could play those games well. But you're not even getting 60 fps in those older games? That means it couldn't run them at 120 or 144 fps, and it probably loses a lot of performance if you have two or more monitors hooked up. But, like I've been saying, the GTX 1050 Ti is a budget card. It's not bad, but your claim that it can "run any game with max graphics" is just wrong. Fallout 4 is an example of this; whatever you think about the company behind it, Fallout 4 is a game the GTX 1050 Ti can't run at max graphics, and I'm sure there are more that I'm not remembering.


On 2019-01-08 at 7:46 AM, neo3587s said:

RTX is about realistic lighting/shadows/reflections, not particles or physics. And actually RTX (or, more accurately, DXR) is integrated into DX12; it's just that only the Nvidia 20xx series supports it with dedicated hardware for a performance boost (technically it can be used on any GPU).

Thanks, I was just about to rant about DXR. But this is what Nvidia does to the ignorant consumers who 'Just buy it!'.

