
Add a "Texture Downscale multiplier"


Doraz_

Recommended Posts

Basically, it would be a godsend to be able to tell the engine to only load the 2nd, 3rd, or 4th mipmap level of a texture.

 

Less memory used and less work the CPU has to do.

 

In Unity this works flawlessly, and so it does in every other engine that took the time to add this step to the texture-reading phase, instead of loading the entire thing by default (so RAM is always maxed out) and only reducing what is sent to or stored on the GPU.
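For anyone curious why skipping the top mip levels saves so much: each mip level is half the resolution on each axis, so a quarter of the bytes. A rough sketch of the arithmetic (the 4096x4096 RGBA8 texture is just an illustrative example, not anything from Warframe's actual data):

```python
# Illustrative mip-chain memory math: each level down is 1/4 the bytes,
# so dropping the top one or two levels removes almost all of the memory.

def mip_chain_bytes(width, height, bytes_per_pixel, skip_levels=0):
    """Total bytes for a full mip chain, optionally skipping the largest levels."""
    total, level = 0, 0
    while True:
        if level >= skip_levels:
            total += width * height * bytes_per_pixel
        if width == 1 and height == 1:
            return total
        width, height = max(1, width // 2), max(1, height // 2)
        level += 1

full = mip_chain_bytes(4096, 4096, 4)                  # whole chain resident
skipped = mip_chain_bytes(4096, 4096, 4, skip_levels=2)  # start at mip 2
print(f"full chain:    {full / 2**20:.2f} MiB")    # ~85.33 MiB
print(f"skip 2 levels: {skipped / 2**20:.2f} MiB")  # ~5.33 MiB, ~16x less
```

So starting from the 3rd level (skip 2) cuts the chain to roughly a sixteenth of its size, which is the whole appeal of the request.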

 

Not essential, but I would love it immensely.


As an addendum: that way you could even give us a new option that says "load entire level at start".

 

Basically, instead of things being instantiated (along with their new DX12 shaders) after the first initialization of an object, modern CPUs are so fast that all of that work could happen at the start. Once it's done, calling the resulting experience "smooth" would be an understatement.

 

If everything takes less memory, you can load it all at once!


7 minutes ago, Doraz_ said:

As an addendum: that way you could even give us a new option that says "load entire level at start".

 

Basically, instead of things being instantiated (along with their new DX12 shaders) after the first initialization of an object, modern CPUs are so fast that all of that work could happen at the start. Once it's done, calling the resulting experience "smooth" would be an understatement.

 

If everything takes less memory, you can load it all at once!

DX12 shader compilation is often done in the launcher as a side-task with updates. I'm not sure if this is implemented yet.

As for a global texture scale control: that seems too granular and too confusing for less technical players to include in a standard menu. It seems like the sort of thing a "texture memory" setting should cover, so that the engine can manage toward the target, but I can certainly advocate for exposing advanced engine options in their own menu.
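To make the "texture memory" idea concrete, here's a toy sketch of how such a setting could work: the user picks a byte budget and the engine raises a global mip-skip level until the working set fits. All of the names, sizes, and the greedy strategy are made up for illustration; a real streamer would be far more selective per texture.

```python
# Hypothetical "texture memory" budget: raise the global mip-skip level
# until the resident texture set fits under the user's byte budget.

def bytes_at_skip(width, height, bpp, skip):
    """Bytes for a texture whose top `skip` mip levels are not resident."""
    w, h = max(1, width >> skip), max(1, height >> skip)
    total = 0
    while True:
        total += w * h * bpp
        if w == 1 and h == 1:
            return total
        w, h = max(1, w // 2), max(1, h // 2)

def fit_to_budget(textures, budget):
    """Smallest global skip level at which the whole set fits the budget."""
    used = 0
    for skip in range(16):
        used = sum(bytes_at_skip(w, h, bpp, skip) for (w, h, bpp) in textures)
        if used <= budget:
            return skip, used
    return 16, used

# A fake working set: a few hero textures plus many environment textures.
scene = [(4096, 4096, 4)] * 8 + [(2048, 2048, 4)] * 32
skip, used = fit_to_budget(scene, budget=512 * 2**20)  # 512 MiB budget
print(f"skip {skip} mip level(s), {used / 2**20:.0f} MiB resident")
```

The upside of a budget over a raw multiplier is exactly what's argued above: the player states an outcome ("use at most 512 MiB") and the engine picks the degradation, instead of the player reasoning about mip levels.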


You know this already for sure, but for anyone else: the slowdowns are proportional to the number of pixels present in an image.

For pixel-art games it's not an issue, but for games like WF it is.

 

All those old games, like Metal Slug, surely didn't have the memory to load everything at once ... so I feel safe to assume they too loaded things as they were needed ...

BUT you didn't feel any interruption to the gameplay cuz the images were so small.

 

If someone knows more, and I am in error, and Metal Slug DID load everything at the beginning, then I'm sorry xD (and thanks for the correction if that's the case).

 

I say this cuz I tried writing my own texture-loading solution, based on JSON in/out, and discovered the hard way that you can't batch those operations; they execute one after another in a single update, but still not in parallel (at least, I didn't know how to do it in a better way 😂😂😂).
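One common way around the "one after another" problem is to hand the decode work to a worker pool. A toy sketch (the JSON "texture" format here is invented purely for illustration, not the format from the post above):

```python
# Toy batched texture decode: sequential vs. a thread pool.
import json
from concurrent.futures import ThreadPoolExecutor

def decode_texture(blob):
    """Pretend-decode a JSON texture descriptor into a flat pixel list."""
    desc = json.loads(blob)
    return [desc["fill"]] * (desc["width"] * desc["height"])

blobs = [json.dumps({"width": 4, "height": 4, "fill": i}) for i in range(8)]

# Sequential: each decode waits for the previous one to finish.
sequential = [decode_texture(b) for b in blobs]

# Batched: decodes run on worker threads; map() returns results in order.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(decode_texture, blobs))

assert parallel == sequential  # same results, potentially overlapped work
```

Caveat: in CPython, pure-Python JSON parsing is throttled by the GIL, so threads mostly help when the per-item work is file I/O or a native decoder that releases the GIL; otherwise a process pool is the usual escape hatch.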


4 minutes ago, BlueQuiller said:

DX12 shader compilation is often done in the launcher as a side-task with updates. I'm not sure if this is implemented yet.

As for a global texture scale control: that seems too granular and too confusing for less technical players to include in a standard menu. It seems like the sort of thing a "texture memory" setting should cover, so that the engine can manage toward the target, but I can certainly advocate for exposing advanced engine options in their own menu.

I vaguely remember hearing that Steve was investigating runtime compilation and blaming it for slowdowns.

It's not impossible ... it's just a matter of how to go about it based on your experience ... I would fake it, having a library of pre-computed stuff, but there goes file size, build times, and shader variants.

 

 

For the textures, I was so happy they implemented DXT compression, the download size is "almost" perfect 💓

Problem is that they are still using a system that just sets a max target to aim at, without capping the max quality of a texture.

 

From my testing, if something is close to the camera, it's gonna get loaded IN ITS ENTIRETY! (can't miss the texture of our floofs now, can we ...)
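The behaviour described, in miniature: a streamer typically picks a mip from how big the texture appears on screen, so anything filling the view requests full resolution; a user cap would simply clamp that choice. This is a generic sketch of the idea, not Warframe's actual heuristic, and every number in it is illustrative:

```python
# Sketch: screen-coverage-driven mip selection, with an optional user cap.
import math

def desired_mip(texture_size, screen_pixels_covered):
    """Mip level whose resolution roughly matches the on-screen footprint."""
    if screen_pixels_covered <= 0:
        return 31  # off-screen: effectively "don't load"
    ratio = texture_size / math.sqrt(screen_pixels_covered)
    return max(0, int(math.log2(ratio))) if ratio > 1 else 0

def capped_mip(texture_size, screen_pixels_covered, user_cap=0):
    """Same, but never sharper (lower mip) than the user's cap allows."""
    return max(user_cap, desired_mip(texture_size, screen_pixels_covered))

# A 4096-texel texture filling a 1080p screen asks for near-full resolution...
print(desired_mip(4096, 1920 * 1080))               # prints 1
# ...while a user cap of 2 holds it to quarter resolution at most.
print(capped_mip(4096, 1920 * 1080, user_cap=2))    # prints 2
```

Which is the distinction being made above: "max target to aim at" is the coverage heuristic, and "capping the max quality" would be the extra `user_cap` clamp on top of it.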


9 minutes ago, Doraz_ said:

For the textures, I was so happy they implemented DXT compression, the download size is "almost" perfect 💓

And with that, texture quality went to sh*t, sadly

I'd rather they implement resolution scale above 100%, but why bother when we have DLSS


Just now, Myscho said:

And with that, texture quality went to sh*t, sadly

I know ... I'm totally OK with my request being dismissed, as most people only care about the quality of the things on screen.

With time I'll be more and more wrong, as fewer and fewer people will have a problem with 100 GB downloads

( you know ... allegedly .... if things get better :/   ) 

 

Plus, Warframe didn't go all in with the compression. It's a good compromise ... it can get waaay worse than what WF did, I can assure you


On 2022-07-23 at 5:35 PM, Doraz_ said:

I vaguely remember hearing that Steve was investigating runtime compilation and blaming it for slowdowns.

It's not impossible ... it's just a matter of how to go about it based on your experience ... I would fake it, having a library of pre-computed stuff, but there goes file size, build times, and shader variants.

 

 

For the textures, I was so happy they implemented DXT compression, the download size is "almost" perfect 💓

Problem is that they are still using a system that just sets a max target to aim at, without capping the max quality of a texture.

 

From my testing, if something is close to the camera, it's gonna get loaded IN ITS ENTIRETY! (can't miss the texture of our floofs now, can we ...)

The idea of distributing the precompiled shaders for common system specs (say you submit a system code and it fetches the appropriate lib) with updates is a good one 👍

They do try to load in lower-res textures before the higher-res ones, from my experience (that's what the trilinear filtering is for, iirc), but I guess there's always room to improve. I still use DX11, btw.

 

On 2022-07-23 at 5:45 PM, Doraz_ said:

I know ... I'm totally ok with being dismissed in my request, as most people only care about the quality of the things on screen.

With time I'll be more and more wrong, as fewer and fewer people will have a problem with 100 GB downloads

( you know ... allegedly .... if things get better :/   ) 

 

Plus, Warframe didn't go all in with the compression. It's a good compromise ... it can get waaay worse than what WF did, I can assure you

One of the best things about Warframe is how well it runs on old hardware, and a lot of people (including me) have slightly older systems/specs and slower internet connections and appreciate it when we can access the same gameplay on our inferior equipment at the cost of some fidelity. Being able to reach a wider target audience like that is good for a game's popularity. It is also important to have attractive graphics, however, so optional texture packs like Final Fantasy does, for example, are a good way of striking a balance.


2 minutes ago, Doraz_ said:

Plus, Warframe didn't go all in with the compression. It's a good compromise ... it can get waaay worse than what WF did, I can assure you

I'd like to see that, because it's pretty bad with Warframe's textures. Remember times before, when textures were crisp? Now they're a muddy, blurry mess, and it hurts my eyes, especially when I use an HDR monitor


2 minutes ago, Myscho said:

I'd like to see that, because it's pretty bad with Warframe's textures. Remember times before, when textures were crisp? Now they're a muddy, blurry mess, and it hurts my eyes, especially when I use an HDR monitor

wow :O

 

I'm so curious to know if it's as good as they say. I actually have one, a new HDR monitor; the desktop and internet indeed "felt more crisp" ... but I ended up disabling it for compatibility reasons the next day xD

 

Would like to know how it relates to AMOLED displays as well, personally ... how the hek are you supposed to keep track of how it looks on a normal one, an HDR one, an AMOLED ... makes my head spin.

 

Going from standard to the other two is easy and predictable ... but not the other way around.

There is an entire screen pass that simulates HDR in most applications ... so if you don't disable the in-game one, you'd be doing the same operations twice!
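The "same operations twice" worry, in miniature: tone-mapping curves are not idempotent, so applying one twice visibly flattens and darkens the image. A sketch using the classic Reinhard operator purely as a stand-in (no claim that Warframe or any particular display uses this exact curve):

```python
# Tone mapping applied twice is not the same as applying it once:
# Reinhard compresses [0, inf) into [0, 1), and re-applying it
# compresses the already-compressed result further.

def reinhard(x):
    """Simple Reinhard tone map."""
    return x / (1.0 + x)

hdr_value = 4.0                        # a bright HDR sample
once = reinhard(hdr_value)             # 4/5 = 0.8
twice = reinhard(reinhard(hdr_value))  # 0.8/1.8 ~= 0.444
print(once, twice)
```

Hence the advice above: if the display (or OS) already runs an HDR pass, the in-game one should be turned off so the curve is only applied once.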


39 minutes ago, Doraz_ said:

I say this cuz I tried writing my own texture-loading solution, based on JSON in/out, and discovered the hard way that you can't batch those operations; they execute one after another in a single update, but still not in parallel (at least, I didn't know how to do it in a better way 😂😂😂).

Interesting! So by batch texture loading you mean DX12 shader pre-compilation or do you mean loading textures during level loading in-game?


3 minutes ago, Doraz_ said:

so if you don't disable the in-game one, you'd be doing the same operations twice!

It kinda annoys me that HDR in Warframe is tied to TAA, dunno why, since TAA is pretty much garbo AA and works like a post-process. I don't like how TAA looks visually in Warframe, so I don't use it, and then it's a no-go for HDR


6 minutes ago, Doraz_ said:

wow :O

 

I'm so curious to know if it's as good as they say. I actually have one, a new HDR monitor; the desktop and internet indeed "felt more crisp" ... but I ended up disabling it for compatibility reasons the next day xD

 

Would like to know how it relates to AMOLED displays as well, personally ... how the hek are you supposed to keep track of how it looks on a normal one, an HDR one, an AMOLED ... makes my head spin.

 

Going from standard to the other two is easy and predictable ... but not the other way around.

There is an entire screen pass that simulates HDR in most applications ... so if you don't disable the in-game one, you'd be doing the same operations twice!

I've heard from my friend that managing HDR can be quite a hassle like this! Haha


6 minutes ago, BlueQuiller said:

Interesting! So by batch texture loading you mean DX12 shader pre-compilation or do you mean loading textures during level loading in-game?

the second thing!

 

It was my attempt at a completely open way to load data in and out, compatible with virtually any device in existence, past and future.

(But you can expand it and make it work with shaders and even code itself ... again, I'd pre-compute all possible combinations just to avoid bugs tho)

Thing is ... it's so compatible that it compensates by running ... very bad.

 

There are problems to resolve deep down that are just beyond me.

It works great with pixel art tho, cuz even on older systems there are so few pixels for those operations.

It's the same reason vertex operations are usually quicker.


On 2022-07-23 at 6:06 PM, Myscho said:

It kinda annoys me that HDR in Warframe is tied to TAA, dunno why, since TAA is pretty much garbo AA and works like a post-process. I don't like how TAA looks visually in Warframe, so I don't use it, and then it's a no-go for HDR

Depending on the "TAA Sharpen" setting TAA can go from looking really bad to really good. Does it work for you? I think it works better at higher resolutions and framerates, so I don't use it.

 

On 2022-07-23 at 6:25 PM, Doraz_ said:

There are problems to resolve deep down that are just beyond me.

Yeah, definitely. I think we're pretty much seeing the engine being updated in real-time, so we can only wait. Sometimes I wonder what Warframe could look like in UE5, or maybe on a fork of Evolution.


5 hours ago, BlueQuiller said:

Depending on the "TAA Sharpen" setting TAA can go from looking really bad to really good. Does it work for you? I think it works better at higher resolutions and framerates, so I don't use it.

I don't use it either, because it creates shimmering and ghosting on textures


22 hours ago, Myscho said:

I don't use it either, because it creates shimmering and ghosting on textures

I understand, it does that for me too, but the sweet spot for me is at about 20% sharpen on 1080p. Here's hoping future updates improve the situation overall.


  • 1 month later...

BUMPING THIS ... cuz I came back and did some testing ...

 

On my relatively normally decorated orbiter, and in the Drifter's Camp

 

I HAVE 1.9 GB OF VRAM USED ... THAT'S HERESY!!! ... for a place that will be loaded in and out over and over as I do missions all over the Star Chart.

The power used to load all those gigabytes of textures over and over is insane.

 

By comparison, my PC's fans don't even start WHILE I'M ACTUALLY PLAYING THE GAME xD

 

For people like me who just want to play a little bit and have already spent 4k+ hours looking at this game, a texture multiplier solution would do wonders.

Levels will load in an instant ... especially if you keep the textures in memory!!!

It will look like blurred Minecraft, but at least I'd have a choice on what to prioritize: graphics or speed.
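Back-of-the-envelope for the orbiter numbers above: halving texture resolution once cuts pixel count (and roughly VRAM plus load bandwidth) to a quarter, halving twice cuts it to a sixteenth. The 1.9 GB figure is the one reported in this post; the rest is just arithmetic:

```python
# Rough effect of a global texture downscale multiplier on the reported
# 1.9 GB orbiter figure: pixels (and so, roughly, bytes) scale by 1/4
# per halving of resolution.

vram_gb = 1.9  # figure reported above

def scaled_vram(gb, halvings):
    """Approximate VRAM after halving texture resolution `halvings` times."""
    return gb / 4 ** halvings

for h in range(4):
    print(f"1/{2 ** h}x resolution -> ~{scaled_vram(vram_gb, h):.2f} GB")
```

So even a single halving would bring the orbiter's textures under half a gigabyte, which is the trade the request is asking to make available.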


Archived

This topic is now archived and is closed to further replies.
