Today we're taking a look at CPU performance in Battlefield 2042, and this is possibly the toughest benchmark we've ever done. The problem with testing a multiplayer game like Battlefield 2042 is that it's extremely difficult to get accurate comparative data. Testing a hardware configuration or two isn't too difficult or time consuming: play the game for a few minutes on the same hardware under the same conditions, do this three times to record an average, and you'll have a pretty good idea of how the configurations compare. It may not be an exact apples-to-apples comparison, but it's a reasonably consistent standard.
But testing 20+ configurations to compare a multitude of CPUs is a daunting undertaking. In short, for 7 days straight I did little else but try to load into a 128-player Conquest game on the same map (successfully, in the end).
For testing we used the Orbital map, and of course this map wasn't always available in the rotation, so we had to wait until it came around again. This, along with a number of other factors, meant we could only test 3 or 4 CPUs per day. We also ran separate 60-second tests on the same map with bots, which is a more controlled environment since there's a fixed number of players in the game and they are all active. It's not as CPU demanding, as there are fewer players and the AI load is different, but we're more confident in the accuracy of this data because it's a more controlled test.
The 128-player capture data is based on 3 minutes of play time, and since the number of players on the server can change, along with what those players are actually doing, there's more variance; the 3-run average helps smooth this out. Please note, though, that the margin of error is higher compared to our more controlled tests, and we certainly noticed greater variance from run to run.
It's also worth noting that the game received a patch during our testing and Nvidia released an updated GPU driver. We used the GeForce RTX 3090 for testing because it was typically faster than the Radeon RX 6900 XT in this game, as we found in the GPU benchmark a week ago.
None of these updates affected the results: we believe the Nvidia driver mainly addressed DLSS-related issues, while the game patch focused on stability and bug fixes. Final testing was conducted with the GeForce Game Ready 496.76 WHQL driver on the latest version of the game. All configurations used 32GB of dual-rank, dual-channel DDR4-3200 CL14 memory, and we also gathered some memory results with a few different configurations for good measure.
Let's start with the CPU tests first …
We usually test CPU performance at lower resolutions like 1080p to remove GPU bottlenecks, although it's not really necessary in Battlefield 2042 multiplayer, as you'll soon see when we test at 1440p.
But here at 1080p we see that for the best performance you want a 12th-gen Intel processor, though we're only talking about an 8% advantage for the 12900K over the 5950X. The 12600K was also 9% faster than the 5800X, although the 1% low and 0.1% low data were comparable. What's really interesting, however, is that despite the game's heavy CPU load (at least by normal gaming standards), the 5950X was only 5% faster than the 5600X when comparing average frame rates, and up to 14% faster in the 1% lows.
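As a quick aside on how these metrics work: the average and the 1% / 0.1% lows are derived from the recorded frame times. The sketch below is our own simplified illustration (capture tools may differ in their exact method); it assumes the "lows" are reported as the average fps over the slowest 1% or 0.1% of frames:

```python
# Hypothetical sketch: deriving average fps and 1% / 0.1% "lows"
# from a log of frame times (in milliseconds). This is our own
# illustration, not the exact method any particular capture tool uses.

def fps_metrics(frame_times_ms):
    """Return (average fps, 1% low fps, 0.1% low fps)."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # Sort slowest-first; the "1% low" is the average fps computed
    # over the slowest 1% of frames (0.1% low likewise).
    slowest = sorted(frame_times_ms, reverse=True)

    def low(pct):
        k = max(1, int(n * pct / 100))
        chunk = slowest[:k]
        return 1000.0 * len(chunk) / sum(chunk)

    return avg_fps, low(1.0), low(0.1)

# Example: mostly 10 ms frames (100 fps) with a few 25 ms hitches.
avg, low1, low01 = fps_metrics([10.0] * 997 + [25.0] * 3)
```

For this sample data the average lands near 100 fps, while a handful of 25 ms hitches drag the 0.1% low all the way down to 40 fps, which is why the lows are a far better stutter indicator than the average.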
Battlefield 2042 uses 8 cores when available, so on the 5950X half of the cores did essentially nothing. Even the active cores aren't fully exhausted on Zen 3, which is why the 5950X recorded a utilization of only around 30 to 40% in 128-player games.
The 5800X's utilization was more in the 70-80% range, so the only reason the 5950X was a few frames faster would be its slightly higher frequency, since things like cache capacity per CCD are the same.
The 5600X was similar despite being a 6-core/12-thread CPU, since the game doesn't max out the 5800X. This means a fast 6-core Zen 3 processor is still fine, though it's right on the edge, with utilization often pegged at 100%. Despite this heavy load, the game didn't stutter, at least no more than on Zen 3 processors with higher core counts. But it does mean you're right at the limit with that part, and slower 6-core/12-thread processors will see a drop in performance, provided your GPU is capable of pushing above 100 fps at your desired quality settings.
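A back-of-the-envelope sketch shows why the utilization numbers read the way they do. This is our own arithmetic, assuming the game keeps roughly eight threads busy (a simplification, since game and OS work isn't confined to exactly eight threads, which is why the observed figures run somewhat higher than this naive floor):

```python
# Rough illustration (our assumption): if the game keeps ~8 threads
# fully busy, reported utilization scales with the CPU's logical
# thread count, so bigger CPUs show lower percentages for the same work.
GAME_THREADS = 8  # hypothetical steady busy-thread count

def naive_utilization(logical_threads):
    """Lower-bound utilization if 8 threads are fully busy."""
    return 100.0 * min(GAME_THREADS, logical_threads) / logical_threads

for name, threads in [("5950X", 32), ("5800X", 16), ("5600X", 12)]:
    print(f"{name}: >= {naive_utilization(threads):.0f}%")
```

The floors work out to 25% for the 5950X, 50% for the 5800X and ~67% for the 5600X, which tracks with the 30-40%, 70-80% and near-100% readings we saw.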
With the newer CPU architectures it's less about core count than about the IPC a given architecture offers. There's a little more variance on the Intel side, as the L3 cache capacity increases from Core i5 to i7 and from i7 to i9.
The difference between the 11th-gen models is minimal, even though we're talking 6 cores versus 8 cores. With 10th gen, we see up to a 15% spread between the 10600K and 10900K, and this is mainly down to L3 cache capacity.
AMD's Zen 2 processors matched 10th-gen Intel, and it was good to see parts like the Ryzen 5 3600X keeping up with the Core i5-10600K. Moving to the Zen+ parts, you can see these older Ryzen CPUs showing their age. The 2700X struggled despite being an 8-core processor, with 0.1% lows of 40 fps, and while its 79 fps average was still respectable, the 5800X was 43% faster here.
Modern 4-core/8-thread CPUs can still technically play Battlefield 2042, but you can expect much more noticeable stuttering than with comparable 6- and 8-core models. What cannot deliver playable performance are 4-core/4-thread CPUs like the Core i3-8350K: the game was essentially broken on this CPU, stuttering constantly.
Looking at these numbers, the most surprising part is that in large 128-player matches, even with a significant amount of processing power invested, it's difficult to get above 100 fps. We'll talk more about this at the end of the article.
The 1440p results are interesting as they much more closely mirror the GPU tests, which largely removed the CPU limits. At this resolution the results become GPU-bound when using high-end CPUs such as Intel's 12th-gen parts, while performance for the remaining CPUs is roughly unchanged. For example, the Core i9-11900K only fell from an average of 113 fps to 110 fps.
This explains why many Battlefield players have been unable to improve performance by lowering the resolution or quality settings: they're simply not GPU-limited, but they're probably not CPU-limited in the way you might think either.
Testing with bots
Now, this data is based on a custom bot match that only uses the game's AI, and compared to the 128-player results we just saw, CPU usage dropped ~15% for a part like the Ryzen 5 5600X. That was enough to boost the average frame rate by about 30% and improve the 1% lows by a whopping 50%, although interestingly the 0.1% lows stayed pretty much the same, at least on the 5600X.
It's similar with older 6-core/12-thread processors like the 2600X, where the average frame rate increased by 20% while the 1% lows rose by an incredible 59%.
But it's not only mid- to low-end CPUs that benefit massively from this lower workload: the 12900K's average frame rate increased 33%, with a 94% jump in 1% low performance. It's interesting to see how a seemingly small difference in utilization changes performance so massively. The nature of the CPU load is likely very different, though, which is why utilization figures alone can be quite misleading.
If we jump to 1440p we become GPU-bound, and processors from the 10600K up all deliver similar average frame rates, although the 12th-gen CPUs were much better when considering 0.1% low performance.
Since we're severely CPU-constrained in Battlefield 2042, it makes sense that memory should play a key role in performance, and it definitely does. However, when we switched to DDR5-6000 with the 12900K, we found no improvement, but rather a slight decrease in frame rate. This is a real shame and another blow to the current state of DDR5.
Moving on to the Ryzen 9 5950X results, we installed some inexpensive DDR4-3000 CL18 single-rank memory, and here we see that the low-latency DDR4-3200 CL14 memory increases the average frame rate by 18%, with a 15% improvement in the 1% lows. This is a significant difference, since DDR4-3000 and 3200 are similar in terms of frequency; the huge difference, of course, is between the CL18 and CL14 timings.
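To put those timings in perspective, you can estimate the first-word latency in nanoseconds from the CAS latency and transfer rate. This is a standard back-of-the-envelope calculation, not data from our runs:

```python
# First-word latency estimate: CL clock cycles at the memory clock,
# which for DDR is half the transfer rate, expressed in nanoseconds.
def cas_latency_ns(transfer_rate_mts, cl):
    return 2000.0 * cl / transfer_rate_mts

cheap = cas_latency_ns(3000, 18)  # DDR4-3000 CL18 -> 12.0 ns
fast = cas_latency_ns(3200, 14)   # DDR4-3200 CL14 -> 8.75 ns
```

Despite the similar frequencies, the CL14 kit's first-word latency is roughly 27% lower, which lines up with the sizable performance gap we measured.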
Using the same memory configurations, we saw an improvement of up to 21% with the 8700K, and while that's a completely different CPU architecture, it makes sense that the more CPU-limited you are, the bigger the boost better-quality memory can provide. So if you have an 8700K and can only manage around 70 fps on a relatively high-end GPU, tweaking your memory can deliver noticeable performance improvements.
What we learned
As many gamers have noticed, Battlefield 2042 is a very CPU-intensive game and exceeding 100 fps can be a real challenge. But is this the developer's fault, is the game severely unoptimized, and can it be fixed?
In our view, Battlefield 2042 faces the same problem as many modern games. Yes, it's very CPU intensive, but gamers shouldn't fixate on the utilization percentage their CPU reports in Battlefield, as other aspects of the CPU can limit performance, such as cache performance or memory latency, which aren't reflected in that figure. If the developers changed the game to put more load on the cores and drive that number up on high-end CPUs, it would create performance issues on low-end CPUs. The game appears to be very demanding on several aspects of the CPU, and optimizing for multiple areas could be difficult, but that doesn't mean it's unoptimized overall.
So if you think it's bad now, many players would have no chance of getting playable performance if the game were built to fully exploit modern 8 or 12 core processors, for example. If we look at the official system requirements and focus on the AMD processors (Intel's recommendations are nonsense), we see that the minimum specification is a Ryzen 5 1600, and based on our tests of parts like the R5 2600X, which is only marginally faster, this recommendation makes sense as an absolute minimum.
The recommended specification calls for at least a Ryzen 7 2700X, and that's about what you'd want as a minimum. It has to be said that while this CPU was still a little overwhelmed, the game was perfectly playable, so the recommendation makes sense. Had the developer put more stress on the CPU, the recommended spec would become something like the 5800X, and at that point few would be able to enjoy the game.
While gamers are often quick to criticize developers by throwing around blanket terms like "unoptimized," the truth is much more complicated. And Battlefield 2042 has a lot going on: the player count was doubled, which basically showed quad-core processors the door while turning up the heat on the previous generation's 6-core/12-thread processors. The game also features an advanced destruction system, weather effects and so on.
At this point, we don't think the CPU usage we're seeing is unjustified, nor does it suggest the game is poorly optimized (in terms of CPU / GPU … there have been other complaints). Could more be done to tweak the game? Probably, but would that radically change performance without compromising player counts or effects? We doubt it.
It's a balancing act between keeping the game playable for the majority of the fan base and adding new features, like a higher player count, to make the game more exciting. You can't do more while requiring less, and I think that's what a lot of players were expecting.
Finally, if you're one of the (small?) group of gamers who want to give Battlefield 2042 a second chance, what's the best CPU? If you're on the AM4 platform, the Ryzen 7 5800X seems like the best option, although the cheaper 5600X works fine as far as we can tell.
The 5800X has come down in price, and at $390 it offers a better price-per-core ratio than the 5600X. Given that and the added headroom, it's probably the way to go. For Intel owners, it depends on what you have: with anything older or slower than the Core i5-10600K, an upgrade to a 12th-gen part is worth considering.
Right now the Core i5-12600K looks like the perfect CPU for Battlefield 2042. Pair it with the MSI Z690 Pro-A or Gigabyte Z690 UD and you have a powerful $500 combo. And of course, DDR5 isn't required; in fact, it's best avoided for now.
Speaking of RAM, since this is a very CPU-sensitive game, memory affects performance more than in other games. For those playing at 4K this will be less of an issue, but for those chasing as many frames as possible at lower resolutions, tightening the timings and increasing the frequency will improve performance dramatically.
How much RAM do you need? Not much. 16 GB is sufficient, as total system usage never exceeded 12 GB when playing Battlefield 2042 in our tests. Generally it hovered around 10 GB, and that was with 32 GB installed in our test bed. The only time you'll creep past that is when you run out of VRAM, and the game uses a lot of VRAM at the ultra quality settings, so you'll want a graphics card with at least 8 GB to play at 1440p and above.
Going back to our earlier GPU tests, the 1080p data shows that all you need is a GeForce RTX 3060 Ti or Radeon RX 6700 XT if you're using a high-end CPU, because the CPU is the primary performance-limiting component, not the GPU. At 1440p you'll want an RTX 3080 or 6800 XT, and our 1440p CPU and GPU results are very similar, with only a ~10% difference in performance between the two test methods.
That is the CPU and system performance of Battlefield 2042 in a nutshell.
Bring a big CPU because you will need it …
- Intel Core i9-12900K on Amazon
- Intel Core i7-12700K on Amazon
- Intel Core i5-12600K on Amazon
- AMD Ryzen 9 5950X on Amazon
- AMD Ryzen 9 5900X on Amazon
- AMD Ryzen 7 5800X on Amazon
- Intel Core i9-11900K on Amazon
- Intel Core i7-11700K on Amazon
- Nvidia GeForce RTX 3070 Ti on Amazon
- AMD Radeon RX 6800 XT on Amazon