We almost lost count of the number of graphics cards we threw at this test, but let's say there were about sixty GPUs from both camps used to test Borderlands 3 performance. For testing, we run the game in DirectX 11 mode, which is generally smoother than DX12, a mode that is still under development for this title.
Borderlands 3 seems to hammer the CPU quite a bit, and we suspect the DRM is responsible. Many quad-core owners have complained about terrible stuttering. We used our Core i9-9900K test bench, clocked at 5 GHz and paired with 16 GB of DDR4-3400 memory. After some research, we decided against using the built-in benchmark; we generally try to avoid canned benchmarks when we can. Instead, we run our test in the Propaganda Center, although performance there appears to be close to the numbers you get from the benchmark tool.
With that in mind, if you want to get the best possible performance out of Borderlands 3 with only a slight visual hit, check out Tim's excellent optimization guide on YouTube. With custom settings, he was able to increase performance by about 50% with virtually no change in image quality compared to the Ultra preset.
Some additional test notes: we used the Game Ready 436.30 driver for the GeForce graphics cards and the Adrenalin 2019 Edition 19.9.2 driver for the Radeons. Both are optimized for Borderlands 3, so make sure you're running the latest driver version. Our benchmark was run with the Ultra preset at 1080p, 1440p and 4K, and then a number of GPUs were retested at 1080p with the Medium preset. We also threw in a bunch of older GPUs, for reasons we'll get to shortly.
Starting from the top, we have the RTX 2080 Ti pumping out an average of 122 fps, which doesn't seem all that high for 1080p. The RTX 2080 and 1080 Ti were limited to around 105 fps, which is frankly the kind of performance you'd hope for at 1440p in a game like this.
The RTX 2070 Super was 9% faster on average than the 5700 XT, a fairly typical margin and a good result for the Radeon GPU, considering the 1% low result was slightly better on the 5700 XT as well.
A little further down we see the standard 5700 reach an average of 80 fps in our test, making it slightly faster than the GTX 1080 on average and much better when you compare the 1% low data. Unfortunately, we were unable to test the RTX 2060 Super or the 2080 Super, as those cards are set aside for separate ray tracing tests in content that will be available soon.
Compared to the standard RTX 2060, the Radeon 5700 was 8% faster, so not a big margin, and that means it should be comparable to the 2060 Super. As expected, Vega 56 matched the GTX 1070, while Vega 64 was a bit faster, landing just behind the 1070 Ti.
The GTX 1660 Ti does well here, basically matching the GTX 1070 while edging out the old GTX 980 Ti, the flagship of a few generations ago. The vanilla GTX 1660 makes things difficult for the RX 590, delivering 12% more frames, which is pretty good given it's only about 10% more expensive. The GTX 1660 averaged over 55 fps in our test, while the RX 590 fell below 50 fps.
Even the 3 GB model of the 1060 keeps pace with the regular GTX 1060 and RX 580 at 1080p, with almost 50 fps. The old GTX 970 is fine at 41 fps, and we only see 39 fps from the RX 570. The GTX 1650, however, is about the limit here, and even then you'll want to optimize the quality settings for better performance.
The 1440p results show a similar pattern, now starting with the RTX 2080 Ti at an average of 90 fps. That's performance comparable to a title like Assassin's Creed Odyssey and much worse than The Division 2, Shadow of the Tomb Raider or Metro Exodus, to name just a few.
The standard RTX 2080 was 21% slower, delivering an average of 71 fps. With the 2070 Super we see a further 7% drop in performance. This time the 2070 Super was 11% faster than the 5700 XT, but again, that doesn't really justify the 25% price premium. The standard RTX 2070 and the non-XT 5700 were comparable in performance, pushing frame rates into the mid-50s. Then we find the GTX 1080 and RTX 2060 right at the 50 fps threshold.
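Percentage comparisons like these are easy to misread because "X% faster" and "X% slower" are not symmetric. As a quick sketch (the function names are our own, and the figures are the average fps numbers quoted above), the deltas work out like this:

```python
def pct_faster(fps_a: float, fps_b: float) -> float:
    """Return how much faster card A is than card B, in percent."""
    return (fps_a / fps_b - 1) * 100

def pct_slower(fps_a: float, fps_b: float) -> float:
    """Return how much slower card A is than card B, in percent."""
    return (1 - fps_a / fps_b) * 100

# RTX 2080 Ti vs. RTX 2080 at 1440p: 90 fps vs. 71 fps, from the text.
print(round(pct_slower(71, 90)))  # the 2080 is ~21% slower than the 2080 Ti
print(round(pct_faster(90, 71)))  # equivalently, the 2080 Ti is ~27% faster
```

The same 19 fps gap reads as 21% one way and 27% the other, which is why we state which card is the baseline in each comparison.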
Dropping below 50 fps, you'll find GPUs like Vega 56 and 64, as well as the GTX 1070 and 1660 Ti. Once we get down to the GTX 1660, you really need to reduce the quality settings for better performance. Many of the GPUs below that averaged 25 to 30 fps, which is obviously not an ideal situation on PC.
Jumping to 4K hammers performance, as expected; even the RTX 2080 Ti can't reach 60 fps on average. With Tim's optimizations, the 2080 Ti should achieve an average of just over 70 fps in our test. We might have done more benchmarking with optimized graphics settings, but by the time we had that information, we were already 50 GPUs into testing.
That also means that with optimized quality settings, the GTX 1080 Ti or RTX 2080 should be able to push close to 60 fps at 4K.
Medium settings for budget GPUs
To wrap up testing, we ran some GPUs at the default Medium quality preset, while adding more budget and older GPUs to the mix. Here we see a massive 80% performance improvement for the GTX 1660 when switching from Ultra to Medium. The GTX 1060 recorded a 75% increase and now achieves an average of 82 fps at 1080p.
The Radeon RX 570, with its excellent value for money, saw a massive 109% increase in performance when the quality preset was dropped from Ultra to Medium. We also got a big 78% increase from the R9 390, although it wasn't as breathtaking as the 570's jump. Some old GPUs like the GTX 970 do well with the Medium preset, and the R9 290 hangs in there comfortably with an average of 67 fps.
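As a sanity check, the uplift figures above are just the Medium-preset fps divided by the Ultra-preset fps, minus one. A minimal sketch (function names are our own) using the GTX 1060 numbers quoted here, which lines up with the "almost 50 fps" Ultra result reported earlier:

```python
def uplift_pct(ultra_fps: float, medium_fps: float) -> float:
    """Percentage performance gain from dropping Ultra to Medium."""
    return (medium_fps / ultra_fps - 1) * 100

def ultra_from_uplift(medium_fps: float, uplift: float) -> float:
    """Back out the Ultra-preset fps from a Medium result and a quoted gain."""
    return medium_fps / (1 + uplift / 100)

# GTX 1060: 82 fps at Medium with a 75% uplift implies roughly 47 fps at
# Ultra, consistent with the 1080p Ultra run discussed above.
print(round(ultra_from_uplift(82, 75)))  # ~47
```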
There's one relatively new GPU we haven't mentioned yet, because it was too slow at the Ultra quality settings: the GTX 1650. At Medium settings it averages 65 fps, which is decent, although the RX 570 was almost 30% faster for less money, hence our usual recommendation in favor of the budget Radeons.
Still, the GTX 1650 was much faster than older high-end GPUs like the R9 280X and GTX 770. Those oldies still average 50 fps at 1080p, so you're in fully playable territory, which is great. There were many GPUs in this performance class, e.g. the R9 380X, 7970, 1050 Ti, 680, 1050 and 380.
To average 40 fps or better, you only need a Radeon HD 7950 or GTX 960.
And that's how Borderlands 3 performs at launch across a range of GPUs at Ultra and Medium quality settings. It's a shame the quality presets in this title are so poor; you can achieve a lot by manually adjusting the settings, which leads to an increase of about 50% over the highest-quality results shown here.
As we noted at the start of this feature, quad-core CPUs seem to be fully utilized in this title, which we'll have to investigate further. On the memory side, you'll also want 16 GB of RAM for smooth performance: Borderlands 3 often pushed system memory usage to around 7 GB, with the game itself consuming around 3 GB.
As for graphics memory, 4 GB is fine for 1080p. The 3 GB GTX 1060 did well in our tests, but we use a high-end system that may mask performance issues with that configuration. We even got by with 4 GB at 1440p. You'll want at least 6 GB of VRAM for 4K gaming, although GPU performance becomes the bigger issue at 4K, and high-end GPUs these days have that much memory anyway, so this shouldn't be a problem.
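The memory observations above boil down to a simple rule of thumb per resolution. A minimal sketch (the thresholds are our reading of this testing, not hard requirements, and the names are our own):

```python
# Rough VRAM guidance from our Borderlands 3 testing; actual needs vary
# with quality settings, and a 3 GB card still got by at 1080p here.
VRAM_GUIDE_GB = {
    "1080p": 4,
    "1440p": 4,
    "4K": 6,  # at least 6 GB, though raw GPU power is the bigger limit
}

def min_vram_gb(resolution: str) -> int:
    """Return the rough minimum VRAM (in GB) suggested for a resolution."""
    return VRAM_GUIDE_GB[resolution]

print(min_vram_gb("4K"))  # 6
```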