Destiny 2 Benchmarked: 30 GPUs Examined

Opened to pre-order customers a few days ago, the Destiny 2 beta is on hand today for a round of GPU benchmarking. While the PC retail version won't arrive until late October, this seemed like a great opportunity to see what kind of hardware the game is likely to need.

We understand that performance may change by the time the title launches. However, it's worth noting that AMD and Nvidia released driver updates about a week ago to better support Destiny 2 ahead of the beta, and both Radeon and GeForce graphics cards appear to be working fine so far.

Our Core i7-7700K test system was clocked at 4.9 GHz, although as a spoiler, we found that Destiny 2 runs well on cheaper processors. We also tested the Ryzen 5 1600 and will compare those numbers against the Core i7-7700K later on.

We are currently focusing on 30 GPUs, 17 of which are from the current generation, the other 13 from the previous generation. All cards were tested at 1080p, 1440p and 4K using Destiny 2's 'high' quality setting, which is the second highest quality preset.

The maximum quality preset – referred to as 'highest' – tanks performance because it enables MSAA. Since I didn't notice any visual difference between MSAA and SMAA, I decided to use the second highest ('high') preset, which uses the latter anti-aliasing method.

Since you can't save your progress during the single-player tutorial/intro, I only tested the first 60 seconds of the game. After running it more than 90 times, I can say with certainty that I've memorized it, although this approach was a real pain since we had to quit the game after each run to reset everything. Hopefully once the game is officially released we can find a much more efficient testing method.
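For reference, here's a minimal sketch of how a 60-second pass like this can be reduced to the average and 1% low figures used in graphs like ours. It assumes a PresentMon-style frame-time log with an MsBetweenPresents column and uses one common definition of the 1% low (the average frame rate across the slowest 1% of frames); the file name is purely illustrative and this isn't our exact tooling.

    import csv

    def summarize(path):
        """Return (average fps, 1% low fps) for one benchmark pass."""
        with open(path, newline="") as f:
            # Frame times in milliseconds, one row per presented frame.
            frame_times = [float(row["MsBetweenPresents"])
                           for row in csv.DictReader(f)]

        avg_fps = 1000.0 * len(frame_times) / sum(frame_times)

        # "1% low" here means the average frame rate across the
        # slowest 1% of frames in the run.
        slowest = sorted(frame_times, reverse=True)[: max(1, len(frame_times) // 100)]
        low_fps = 1000.0 * len(slowest) / sum(slowest)

        return avg_fps, low_fps

    avg, low = summarize("destiny2_run01.csv")   # hypothetical log file
    print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")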

Also note that our plans to benchmark the multiplayer portion of Destiny 2's beta were abandoned early, as it was taking far too long to get into a game. Anyway, on to some numbers…

Benchmark time

First up are the 1080p results. As you can see, plenty of cards deliver between 96 and 105 fps on average, and many more are good for 60 fps or better.

Ignoring the high-end gear for now, since this is only 1080p, gamers need an RX 570 or GTX 1060 3GB to stay above 60 fps at all times, and frankly, those are fairly mild requirements. Nvidia currently has a clear advantage in this title, at least at lower resolutions, and we see this when comparing GPUs like the RX 580 and the GTX 1060.

Looking at the previous-generation GPUs, we see that they have no trouble maintaining well over 60 fps, as the Radeon R9 390 and the GeForce GTX 970 show.

Even the R9 380 and GTX 960 did well, never dipping below 60 fps. It's interesting to see that the Fury series wasn't really ahead of the R9 390X, not to mention that it trailed the GTX 980 and was completely trampled by the GTX 980 Ti. Fine wine isn't working here, folks.

We see a larger spread of results at 1440p, and again a good number of GPUs can push over 60 fps. If you're after 100 fps or more, you'll either need a high-end GPU or have to drop the quality preset to medium.

Increasing the resolution to 1440p significantly reduces the margins, and now there are only three frames between the GTX 1060 and the RX 580, both of which offer excellent performance at this relatively high resolution. For an average of around 100 fps you should be looking at a GTX 1070 or Vega 56, and to get well over 100 fps the GTX 1080 Ti is required.

The Fury series also begins to recover, even though the Fury X still sits well below the GTX 980 Ti here too. Meanwhile, the GTX 970 edged ahead of the R9 390, although here too both delivered exceptionally smooth frame rates.

It's pretty shocking that even the GeForce GTX 950 managed to keep frame rates above 30 fps for a console-like experience, albeit with significantly better graphics and a higher resolution.

Finally, for those wanting to play at 4K with high quality settings and an average of at least 60 fps… your options are even more limited. The best choice here is clearly the GTX 1080 Ti, and damn, that card was good at 4K in this title.

As expected, those with an older GPU will ideally want a previous-generation flagship like the GTX 980 Ti or Fury X. Mid-range to high-end models like the GTX 970 and R9 390 did okay, but you'll probably want more horsepower for 4K gaming.

Before wrapping up, I threw the GTX 1080 Ti into the 7700K and Ryzen 5 1600 test systems to see how they compare. These results only cover the first 60 seconds of the single-player portion, and I'm aware that the larger open sections of the game as well as the multiplayer modes will be more CPU demanding, but I can't test those until the game is officially released, so this will have to do for now.

As expected, we ran into a massive GPU bottleneck at 4K and performance is similar, although the 7700K still posted better 1% low results. Moving to 1440p, the Intel CPU offered 15% higher performance, and this margin grew to 27% at 1080p. GPU utilization was good with both CPUs, and the game distributed the load evenly across the R5 1600's cores; there simply wasn't enough work to take advantage of all those extra cores in this title.

As mentioned earlier, you can count on more thorough reporting on Destiny 2's performance once the game actually launches.

Final thoughts

Well, I have to say, although I haven't had a chance to test the multiplayer action yet, I was pleasantly surprised by what I found in the Destiny 2 beta. The game looks great, plays well, and it's already a lot more polished than most of the triple-A titles I've tested in the past year, so hats off to the developers at Bungie.

For single-player, a mid-range GPU has you more than covered at 1080p with high quality settings, and many cards even deliver a smooth 60 fps at 1440p. AMD's GPUs lag well behind at 1080p but come alive at 1440p and 4K, where they were very competitive.

Speaking of 4K: did you notice that the GTX 970 ran neck and neck with the R9 390 at this resolution, both averaging 37 fps? Wasn't the 970 with its gimped 3.5 GB frame buffer supposed to be obsolete and ready for the trash heap? Maybe I messed up my data… but the extremely popular Maxwell-based GPU is still going strong, as is the R9 390.

Further down the food chain, the R9 380 traded blows with the GTX 960, while the GTX 950 dispatched the R7 370 with ease. At the other end of the spectrum, the GTX 980 Ti once again proved far too powerful for the Fury X.

If you're lucky enough to own a current-generation GPU like the GTX 1060 or RX 580, you should be set to enjoy Destiny 2. For the single-player portion of the game, you'll be fine as long as you have a CPU better than an FX-series or old Core i3 chip. All Ryzen CPUs held their own, as did the newer Intel dual-core chips with Hyper-Threading.

The latest Core i3s saw utilization of around 80%, as did the Pentium G4560, while the Ryzen 3 1200 sat at around 60-70% and the Ryzen 5 1400 at only around 50%.
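For the curious, here's a rough sketch of how per-core load figures like these can be sampled while the game runs, using Python's third-party psutil library. It illustrates the measurement idea rather than the tool we actually used.

    import psutil  # third-party: pip install psutil

    # Sample overall and per-core CPU load once per second for ten
    # seconds while the game runs; cpu_percent() blocks for `interval`.
    for _ in range(10):
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        overall = sum(per_core) / len(per_core)
        print(f"overall: {overall:5.1f}%  per core: {per_core}")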

When it came to memory, we only saw about 2.3 GB of VRAM usage at 1440p, which explains why the 3GB GTX 1060 performs so well alongside the GTX 970. Even at 4K we only just reach the limits of a 3GB buffer, and of course these GPUs run out of compute power before memory capacity becomes a problem. As for system memory, 8 GB has you covered, as the game occupies about 5 to 6 GB.
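If you want to check VRAM usage on your own Nvidia card, a simple approach is to poll the stock nvidia-smi utility while the game runs. The sketch below isn't our exact script, but it produces the kind of numbers discussed above.

    import subprocess
    import time

    # Poll the driver once per second for a minute and report VRAM usage.
    for _ in range(60):
        result = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        used_mib = int(result.stdout.strip().splitlines()[0])  # MiB, GPU 0
        print(f"VRAM in use: {used_mib / 1024:.2f} GiB")
        time.sleep(1)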


Overall, Destiny 2 is shaping up to be a great-looking game that seems well optimized for the PC. Now all we have to do is wait for the retail version to arrive in a few months so we can take a closer look at multiplayer performance on enthusiast hardware.
