The long-standing rally series from Codemasters has returned with its twelfth installment, which received largely positive reviews on PC, Xbox One and PS4 last Tuesday. With updated drivers from AMD and Nvidia arriving just a few days later, it seemed like an ideal time to see how the title performs on current and previous generation GPUs.
AMD's Crimson ReLive Edition 17.6.1 driver offers up to a 30% performance improvement on the RX 580 when using 8xMSAA, compared to the previous driver. None of my tests use anything higher than 4xMSAA, since that's the maximum the Ultra preset applies, but you can expect better performance across the board with this driver. As for Nvidia, its 382.53 WHQL driver claims to deliver an optimal gaming experience for Dirt 4, although given the performance we recorded, a "more optimal" driver is probably already in the works. We'll look at the numbers shortly, but as a bit of a spoiler, the red team has the advantage in this title for now.
We're focusing on current and previous generation graphics cards, each tested at 1080p, 1440p and 4K. Please note that the Radeon RX 480 and 470 have been dropped as they perform similarly to the newer RX 580 and 570. Even without those cards, we tested a total of 27 GPUs, so you should have a solid reference point.
All tests were carried out with our usual Core i7-7700K system clocked at 4.9 GHz. We decided to skip CPU scaling results for this title because Dirt 4 doesn't require a powerful CPU. In some internal tests, we found that a modern dual-core CPU with Hyper-Threading, such as the Pentium G4560, works well. The game heavily loads a single thread and makes efficient use of at most two; on most modern quad-core or higher CPUs, the additional threads sit at 20% load or less.
That doesn't mean the game is poorly optimized, just that it isn't CPU-intensive. This isn't an Arma 3 scenario where a single thread is pounded and the CPU ends up being the performance-limiting factor. In other words, I don't see a CPU bottleneck in this game. Of course, it would be better if the EGO 4.0 engine distributed the load more evenly, but that's not a problem for a game like Dirt 4 that is mostly GPU-bound.
For those of you wondering, I saw almost no difference in performance when pairing the GTX 1080 Ti at 1080p with either the Pentium G4560 or a heavily overclocked Core i7-7700K. The CPU really makes little difference here. The Ryzen 5 1400 and 1600 and the Ryzen 7 1700 also performed similarly and were all able to more or less max out the 1080 Ti.
With a GeForce 10 series GPU, the game at ultra quality uses almost 4 GB of VRAM at 1080p, about 4.5 GB at 1440p, and about 6 GB at 4K. With an 8 GB AMD Radeon RX 580, those numbers were slightly inflated, reaching 4.5 GB at 1080p, a little over 5 GB at 1440p and 7.5 GB at 4K. This suggests Nvidia's memory compression technology is somewhat superior.
When it comes to system memory, the game typically consumes around 4-5 GB depending on the configuration, the exception being graphics cards whose limited VRAM buffer forces spillover at resolutions they can't really handle. In any case, 8 GB of RAM should have you covered. That's about all we need for our prelude to Dirt 4's graphics performance...
Test system specifications and memory
We immediately see that the GTX 960 trails the R9 380 by 17%, while the GTX 970 was 31% slower than the R9 390. Although the GTX 980 Ti was a little ahead of the R9 390X on average frame rate, it was 5% slower when comparing minimums.
Meanwhile, the Fury X pulled ahead of the GTX 980 Ti and 1070 at 1080p with an average of 96 fps, making it 20% faster on average and 30% faster on minimum frame rate. Even against the GTX 1080, the Fury X looked good, coming in only 10 fps slower. The Titan XP and GTX 1080 Ti were faster again, but you'd expect that, of course.
High-end Nvidia GPUs suffer from a strange minimum frame rate problem that's worth keeping an eye on.
Jumping to 1440p, very few GPUs can reach 60 fps, and this changes things fairly drastically as the Nvidia GPUs come back into play. The GTX 1060 is now not far behind the RX 580, at least compared to how it fared at 1080p. The GTX 1070 moves ahead of the RX 580, offering a slightly better minimum frame rate and a much stronger average.
It almost looks like Nvidia had a driver overhead problem at 1080p; wouldn't that be ironic? Below we see the GTX 960 is now only one frame slower than the R9 380. The GTX 970, however, still trails the R9 390, sitting 25% behind it.
The GTX 980 Ti gets closer to the Fury X, although it's still a little slower when comparing minimum frame rates. The 980 Ti's 46 fps minimum means it can only keep up with the Nano and the 390X.
Finally, at 4K, only the GTX 1080 Ti crosses the 60 fps barrier. I should note that a Gigabyte Aorus GTX 1080 Ti was used for testing, while the Titan XP is obviously a reference card.
Quality preset comparisons
Lastly, we ran some quality preset tests on the GTX 1060 and RX 580 at 1440p. While the GTX 1060 was almost 30% slower with the ultra quality preset, it cut that deficit to just 11% with the high quality preset. When switching to medium, the 1060 managed to pull ahead by about 10%, and the same was observed with the low and very low presets.
With the high and ultra presets, some setting is clearly enabled that hurts Nvidia more than it does AMD.
Before we wrap things up, here's a quick look at the various quality presets, recorded side by side with the Titan XP at 4K.
AMD performs well in Dirt 4, especially in terms of minimum frame rates. That said, if you're willing to lower the quality preset from Ultra to High or Medium, Nvidia catches up and even gains an advantage at the lower settings; again, we're not sure what hinders the GeForce cards at the higher quality levels.
When dropping the settings from ultra to high, the GTX 1060 saw a massive 60% increase in performance, while the RX 580 improved by 40%; those are some fairly large gains for a slight loss in visual quality, if you ask us. Running on high, both cards can average well over 60 fps at 1440p. So unless you're using extreme hardware, we recommend the high quality preset over Ultra, though that applies to almost all modern games. As usual, vegetation is what really degrades performance when set to Ultra. Tracks with little vegetation ran at over 70 fps on a mid-range graphics card, and that number was often halved on vegetation-heavy tracks I tested.
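For reference, the percentage figures quoted throughout this review come from simple relative-fps arithmetic. Here's a minimal sketch; the fps values in the example are hypothetical, not our measured results:

```python
# Relative performance between two frame rates, expressed as a percentage.
# Example values below are illustrative only, not benchmark data.

def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster (in %) result A is compared to result B."""
    return (fps_a / fps_b - 1.0) * 100.0

# A hypothetical card going from 50 fps on Ultra to 80 fps on High:
uplift = percent_faster(80.0, 50.0)
print(f"Ultra -> High uplift: {uplift:.0f}%")  # prints: Ultra -> High uplift: 60%
```

Note the asymmetry: a card that is 60% faster than another is not 60% slower the other way around (50 fps is only 37.5% below 80 fps), which is why "X% faster" and "X% slower" comparisons in reviews don't mirror each other.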
(Video: Dirt 4 quality preset comparison: https://www.youtube.com/watch?v=jdBpSHpxXWM)
When it comes to CPU performance, the game really only uses one or two threads, and while that sounds bad, this isn't a game that needs a lot of CPU resources to run well.
If you have a modern graphics card, you can expect Dirt 4 to run smoothly, especially at the high quality preset, though playable frame rates on Ultra are by no means out of reach.