So I wasn't gonna post in this thread again; I had even unfollowed it.
HOWEVER, one of my clanmates remarked on it, so I wanted to put the issue to rest. Skip to the very last section past all the pictures and stuff if you just want the TL;DR.
The thing is, it's not about "lower-end machines."
One of my best real-life friends is a guy named Zak Killian. He writes for a website called The Tech Report.
The Tech Report more or less invented the modern method of benchmarking games using frame times instead of simple average FPS.
The Tech Report is the reason that, when you go to LinusTechTips or GamersNexus, you see 99th-percentile or 95th-percentile measurements.
You can read the original "Inside the Second" article here, where Scott Wasson talks about how average FPS tests don't tell the whole story.
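To build a little intuition for why averages lie, here's a toy illustration I threw together (my own made-up numbers, not from the article): two frame-time traces with the same average FPS that would feel completely different to play.

```python
# Toy example: two frame-time traces with the SAME average FPS,
# but wildly different 99th-percentile frame times.
import statistics

# Trace A: perfectly smooth 60 FPS (16.67 ms per frame).
smooth = [1000.0 / 60.0] * 1000

# Trace B: mostly fast frames, but every 10th frame hitches to 50 ms.
# The fast frames are tuned so the AVERAGE matches trace A.
stutter = [50.0 if i % 10 == 0 else 12.96 for i in range(1000)]

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    avg_fps = 1000.0 / statistics.mean(trace)
    p99 = sorted(trace)[int(len(trace) * 0.99) - 1]
    print(f"{name}: {avg_fps:.1f} average FPS, "
          f"99th-percentile frame time {p99:.2f} ms")
```

Both traces report roughly 60 FPS on an average-based counter, but the second one spikes to 50 ms (20 FPS territory) every tenth frame, and the percentile numbers catch it.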
I enlisted his help to show you the reality of what anisotropic filtering (AF) does to performance in modern, open-world games.
This attitude that anisotropic filtering "doesn't have a performance hit" is a really common one, and it's just wrong.
It's always been wrong, but it's even more wrong in the modern age, where the triangle counts of games have grown so fast.
I think this attitude may arise from the fact that AF is much less demanding than classical multi-sample anti-aliasing, and the two are often conflated.
To test performance, we rode a K-drive around the back side of Fortuna and through the woods on the new Plains.
Testing was done on Zak's machine, which is a Core i7-8700K with a GeForce GTX 1080 Ti. Not exactly a "lower-end machine."
We felt that it would be too difficult to reproduce a combat benchmark, so we stuck to the simple, easily-reproducible K-drive ride.
If anyone wants to know the exact routes, I can produce videos that demonstrate them.
While riding, we recorded gameplay with the Open Capture & Analysis Tool (OCAT).
This is a free and open-source tool that measures game performance, then spits out a CSV file with the data.
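If you'd rather crunch the numbers yourself instead of trusting our graphs, here's a minimal sketch of how you might summarize one of those CSVs (I'm assuming the PresentMon-style "MsBetweenPresents" column here, since OCAT is built on PresentMon; check your capture's header if your version labels it differently):

```python
# Minimal sketch: summarize an OCAT capture CSV.
# Assumes the PresentMon-style "MsBetweenPresents" column; adjust the
# column name if your OCAT version differs.
import csv
import statistics

def summarize(path):
    with open(path, newline="") as f:
        frame_times = [float(row["MsBetweenPresents"])
                       for row in csv.DictReader(f)]
    frame_times.sort()
    avg_fps = 1000.0 / statistics.mean(frame_times)
    p99 = frame_times[int(len(frame_times) * 0.99) - 1]
    print(f"average: {avg_fps:.1f} FPS")
    print(f"99th-percentile frame time: {p99:.2f} ms "
          f"(~{1000.0 / p99:.1f} FPS)")

summarize("ocat_capture.csv")  # hypothetical filename
```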
OCAT can also generate simple graphs of the data, so I'll present those graphs for you here:
The image above graphs frame times for our test runs in the Plains of Eidolon.
Our test run starts from the Cetus door and heads roughly north-west, following the road-like paths.
You will note that enabling 16x anisotropic filtering in the Nvidia control panel produced even worse performance than stepping up to 4K.
For the 8x AF tests, we used the in-game option, and left the Nvidia control panel at "Application-controlled."
This second image graphs frame times for our test runs on the Orb Vallis.
Our test run starts at the Fortuna gate, circles around the left side, goes around the back, and then continues slightly past Fortuna on the north side.
Once again, enabling 16x AF in the Nvidia control panel produced worse performance than increasing the resolution by 67%.
Let me say that again: doubling the anisotropic filtering samples hurt performance more than a 2/3 increase in resolution.
The loss of performance is bad enough, but you'll also notice that in both cases the red line has more ups and downs than the other lines.
That implies a hard performance bottleneck somewhere, and it manifests in the gameplay experience as what we call "micro-stutter."
Have you ever been playing a game that felt like it was running at a really low framerate, like 30 FPS, but when you looked at your FPS counter it said 60+?
That's caused by micro-stutter.
Micro-stutter is NOT exclusive to SLI rigs as people once thought. It is in fact somewhat common, especially on laptops. Lots of things can cause it.
The most common culprit is power-saving behavior, but a major performance bottleneck can also cause it.
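If you want to put a rough number on micro-stutter, here's one quick-and-dirty heuristic (my own, nothing standard): count the frames that take way longer than the typical frame.

```python
# Rough micro-stutter metric: the share of frames that take more than
# 1.5x the median frame time. Even with a healthy average FPS, a high
# spike percentage means the frame pacing is uneven.
import statistics

def spike_percent(frame_times_ms, threshold=1.5):
    median = statistics.median(frame_times_ms)
    spikes = sum(1 for ft in frame_times_ms if ft > threshold * median)
    return 100.0 * spikes / len(frame_times_ms)

# The toy "stutter" trace from earlier: 10% of frames flagged as spikes.
trace = [50.0 if i % 10 == 0 else 12.96 for i in range(1000)]
print(f"{spike_percent(trace):.1f}% of frames are spikes")
```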
I touched on this earlier in the thread, but enabling such a high degree of anisotropic filtering overworks your GPU's texture units.
Large portions of the rendering work in modern 3D games are not done on the programmable shader processors; they're done on fixed-function hardware.
The two most important types are the texture mapping units (TMUs) and the render output units (ROPs).
Your GPU only has a certain, fixed number of each of these units, which means its throughput for those kinds of work is capped at a certain rate.
It's popular to think of the GPU itself as a pipeline.
Data comes in, gets mangled by various types of functional units, and then finally gets shoved into memory by the ROPs.
This happens sequentially, so if you overload any single type of work, the whole process slows down. When you enable 16x AF, you cause the texture-sampling part of the GPU's workload to skyrocket in difficulty.
That bottlenecks the whole pipeline, which drops your framerate.
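To put some back-of-the-envelope numbers on that (worst case only; real hardware skips probes on surfaces that don't need them, and the 1080 Ti figures below are just its published specs):

```python
# Worst-case texture-fetch amplification from anisotropic filtering.
# Each anisotropic "tap" is a trilinear probe (8 texels), and Nx AF can
# take up to N probes per sample on steeply-angled surfaces.
TEXELS_PER_TRILINEAR_PROBE = 8

for af in (1, 2, 4, 8, 16):
    print(f"{af:>2}x AF: up to {af * TEXELS_PER_TRILINEAR_PROBE} "
          f"texel fetches per sample")

# The TMUs' throughput is fixed. With the GTX 1080 Ti's published specs
# (224 TMUs at roughly 1.58 GHz boost), peak bilinear fillrate is about
# 354 Gtexels/s -- and every extra AF probe eats into that fixed budget.
tmus, boost_ghz = 224, 1.58
print(f"peak fillrate: ~{tmus * boost_ghz:.0f} Gtexels/s")
```

That's also why the hit shows up hardest in these wide-open areas: lots of ground and rock stretching away from the camera at shallow angles, which is exactly where AF takes the most probes.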
If you want to check our work, you can download a 7-zip archive that has the test data and Nvidia driver profiles used for testing.
With all of that said, I do want to take a moment to eat a little humble pie.
In my earlier posts I insisted that the difference between 8x and 16x AF would "never" be obvious.
I do stand by that statement with regard to the outdoor, open-world areas. The natural environments don't need that much filtering.
Buuuut, my reviewer friend chided me for being so dismissive of higher AF levels, since the difference CAN be pretty obvious.
In particular, Warframe includes a lot of indoor areas with long hallways that use detailed repeating patterns on the floor.
These kinds of environments are the perfect showcase for anisotropic filtering. Here are some example images I took myself on the Uranus underwater tileset:
Sorry that they're not from exactly the same angle; I couldn't line the shots up because of the stupid box in the way.
I'm sure you can guess that the second image has 16x anisotropic filtering forced in the Nvidia control panel.
There's very little visual difference in the images aside from the distant floor pattern being quite a lot clearer.
It certainly is possible to see the difference here if you're looking for it.
That's the key though, isn't it? You do sort of have to be looking for it. I mean, I don't know—maybe it stands out strongly to some people.
I want to stress that this is an absolute worst-case scenario for the texture filtering. Most hallways in the game are not quite this long and flat.
Personally, I don't think the difference is worth the performance hit. I've been using 8x AF in Warframe (and everything else) for a long time.
Another argument that people have brought up in this thread is that it's wrong to remove the option.
I've said this a couple of times in the thread already, but ultimately in principle I do agree. The option should remain, as far as I'm concerned.
My argument is simply that DE is justified in removing it. Having 16x AF enabled, as we've shown, causes a massive performance hit outdoors.
Consider that this game has millions of users.
If even 0.1% of people foolishly slam all the settings to the top, you're talking several thousand people making forum posts and support tickets.
It's a whole lot simpler to just remove the option. "Save people from themselves," as Steve put it.
To finish off my really, truly final post in this thread, here's what my aforementioned reviewer friend Zak had to say on the topic:
TL;DR—DE is justified in removing the option, so just force it in your driver control panel if you really need 16x AF.