Today we're revisiting AMD's budget-minded Ryzen 3 3300X. This Ryzen 3 CPU impressed us greatly when it launched about two months ago, but since that first review we've hardly looked back at it. To put things in perspective, a lot has happened both inside and outside the tech world in recent months. Component supply has been scarce, and you may recall that AMD had decided at the time to drop next-gen CPU support for all 400-series motherboards.
If you followed that drama, you'll know we spent a lot of time and energy pushing back against AMD, and ultimately they reversed that decision.
Once that issue was resolved, we moved on to testing Intel's 10th-gen Core lineup. Then came Z490 motherboard reviews, which rolled into B550 testing.
The plan has always been to do some detailed Ryzen 3 testing, and we started laying the groundwork for this GPU scaling content many weeks ago. The timing wasn't great, though, as stock was practically non-existent, so we sat on the data. Ultimately, we decided not to delay these benchmarks any longer, so when things return to normal, you'll already have this extra information at hand.
For this test, we compared the 3300X with the Ryzen 5 3600 and the Ryzen 5 2600. In previous GPU scaling benchmarks we also tested the 3900X and the 9900K, though this time we decided to leave them out to declutter the graphs. However, we will bring back some of that data towards the end of the article.
Before we get to that, some basic test notes… We paired the CPUs with G.Skill FlareX DDR4-3200 CL14 memory and a Corsair H115i Pro cooler. All automatic overclocking features such as MCE or PBO were disabled, and the memory wasn't manually tuned, just loaded with its XMP profile. So with a high-quality AIO and low-latency memory, this is strong out-of-the-box performance.
Note that the test setup for this article differs slightly from what we used in our recent CPU reviews, and this affects the results. Here we're using only two memory modules for single-rank operation, whereas in the latest CPU reviews we used four modules for dual-rank memory. Some of the in-game areas used for testing have also been updated in those articles.
Starting with Assassin's Creed Odyssey using the medium quality preset at 1080p, we find that the 3300X sits between the 2600 and 3600 when using the RTX 2080 Ti, with the R5 3600 coming out about 15% faster. However, if you step down to the RTX 2070 Super, that margin is almost completely eliminated: now the 3600 is only one frame faster. The 3300X was still 11% faster than the 2600, a decent advantage over the second-gen part.
By the time we reach the Radeon RX 5700, however, we see very similar performance from all three processors. The 3300X is only 5% faster than the 2600 and ~2% slower than the 3600, as we become more GPU-limited.
Once we use the Radeon RX 580, we are completely GPU-bound and the results we see here are largely within the margin of error. Needless to say, you won't notice any difference between the 3600 and 3300X when playing under these conditions.
If we push the quality preset up a few steps to the maximum setting, the results become even more GPU-bound and more uniform. Remember, we're still only testing at 1080p. It doesn't matter which graphics card you have, whether an RX 580, 5700, 2070 Super or even the 2080 Ti; you'll get practically identical frame rates with each of these CPUs.
Same game, but now we're looking at results at 1440p with the medium preset. With the GeForce GPUs, this data is much more GPU-limited. The 3300X and 3600 delivered identical or practically identical performance with all four GPUs, and both were slightly faster than the 2600.
Finally, we have the Assassin's Creed Odyssey results with the ultra high preset at 1440p. No surprises: we are almost exclusively GPU-bound, and within this group of CPUs there is no difference in gaming performance. If you want to play this and similar games with maximum quality settings, you'll mostly find that the system is limited by GPU performance.
Moving to Far Cry New Dawn, we find similar average frame rate performance between the 3300X and 3600 here, but the 6-core processor delivers better 1% low results, suggesting slightly more consistent performance with the higher-end CPU. It's also interesting to note that although the Ryzen 5 2600 is slower in both average and 1% low performance, it shows only a 23% gap between the two metrics, while the 3300X shows a difference of up to 29%, which again suggests the experience may not be as smooth.
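To make that consistency metric concrete, here's a minimal sketch of how the gap between the average frame rate and the 1% low is worked out. The frame rate figures below are illustrative only, not our measured results:

```python
# Illustrative sketch: comparing average fps against 1% lows.
# The frame rates used here are hypothetical, not measured results.

def low_gap_percent(avg_fps: float, low_fps: float) -> float:
    """Percent drop from the average frame rate to the 1% low.

    A smaller gap means more consistent frame pacing.
    """
    return (avg_fps - low_fps) / avg_fps * 100

# e.g. a CPU averaging 120 fps with 1% lows of 92 fps:
print(round(low_gap_percent(120, 92), 1))  # -> 23.3
```

Two CPUs can post similar averages while one shows a much larger gap to its 1% lows, which is exactly the stutter-prone behavior the article flags.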
Interestingly, when using the Radeon RX 5700, the average frame rate of the 3300X drops below that of the 3600. We've found in the past that AMD's drivers produce more CPU overhead, and perhaps we're seeing that here. When we drop down to the RX 580 things tighten up, though even here we can still see that the higher latency of the Zen+ architecture is a problem for the 2600.
We find more interesting Far Cry New Dawn results at 1080p when we increase the quality preset to ultra. This appears to increase CPU load, and the 3300X is now up to 10% slower than the 3600 when comparing average frame rates.
Comparing the 1% low data, we also see a significant drop for the 2600. The performance trends for the RX 5700, 2070 Super and 2080 Ti look similar, and only when we step down to the more modest RX 580 do we see comparable frame rates across the board.
In the 1440p results with the normal quality settings, there's a smaller gap between the 3300X and 3600 than at 1080p when using the RX 5700, and we also see a significant performance difference between the RX 5700 and RX 580. This is of course because the game is more GPU-limited at the higher resolution.
The 1440p ultra quality data shows only a very slight difference between the 3300X and the 3600 with the RTX 2080 Ti. At 1080p the 3600 was up to 10% faster; here that margin is halved to 5%.
Testing Tom Clancy's Rainbow Six Siege with the medium quality preset (labeled "high" in this game) at 1080p produces some more interesting data. With the RTX 2080 Ti, the 3600 is only 4% faster than the 3300X comparing averages, but 14% faster comparing 1% lows. As a result, the 3600 shows a 27% gap between its average and its 1% low, while the 3300X shows a gap of almost 40%, suggesting the 6-core processor offers a smoother and more consistent gaming experience.
This difference is somewhat neutralized with the slower RTX 2070 Super, and here the 3300X actually delivers a tighter grouping of frame rates thanks to a stronger GPU bottleneck. We see this effect continue as we step down through the GPU performance tiers.
If we increase the graphics preset to ultra while keeping the 1080p resolution, we find that the 3300X and 3600 offer the same performance with GPUs like the RTX 2070 Super, Radeon RX 5700 and RX 580, since all are heavily GPU-limited.
We see a big drop in performance simply by increasing the resolution. At 1440p with the high quality setting, the RTX 2080 Ti paired with the Ryzen 5 3600 drops below 200 fps. As a result, we see very little performance difference between the processors tested: the 3600 is a few percent faster with the RTX 2080 Ti, and performance is identical with the 2070 Super.
For those of you who play at 1440p and use the highest quality settings, there's more evidence here that the CPU isn't as relevant as the graphics chip that powers your game. It is impossible to distinguish between the CPUs used here, and the 3300X was able to maintain well over 100 fps with the RTX 2080 Ti at all times.
The last game we're looking at is World War Z. Testing at 1080p with the medium quality preset, there's practically no performance difference between the 2080 Ti, the 2070 Super and even the RX 5700, while the RX 580 isn't far behind. Here we find a scenario that is almost entirely CPU-limited.
Across all of this data, the 3300X was up to 11% slower than the 3600, with similar margins in both the average and 1% low data. It's important to note that while the 3600 is measurably faster, the gaming experience wasn't better, at least not to an extent you'd notice.
Even when we step up to the ultra quality settings, we see practically the same results.
The 1440p medium results aren't that different either, though we can see the 3300X and 3600 converge with the Radeon RX 5700.
Only when testing with ultra graphics at 1440p does the game become a little more GPU-bound, and now we see similar performance between the 3300X and 3600 with the RTX 2070 Super.
We know four games isn't a lot, but including the Ryzen 3 3300X in this comparison required almost 300 benchmark runs, and the games used should cover most performance scenarios. With that in mind, let's see what average performance looks like; for further points of reference, we've added the Ryzen 9 3900X and the Core i9-9900K to the following graphs.
At 1080p with medium graphics settings, using the fastest RTX GPU available today, the Ryzen 5 3600 is ~11% faster than the 3300X comparing 1% low data. That's a reasonable performance uplift, but considering the 3600 packs 50% more cores, it really isn't that significant. Moreover, the 3300X kept its average above 120 fps throughout these tests.
If you're not using an RTX 2080 Ti and instead have a $500 graphics card like the RTX 2070 Super, that margin drops from 11% to just 4%, or 6% on a Radeon RX 5700. With a more modest sub-$200 GPU like the RX 580, there's no significant margin at all.
Increasing the resolution to 1440p while still playing with medium settings, the 3600 is now only 5% faster than the 3300X with the RTX 2080 Ti. Similar margins are seen with the RTX 2070 Super and RX 5700.
Looking at the ultra quality preset, we see fairly typical scaling as we move down the GPU stack. With the RTX 2080 Ti, the 3600 is 8% faster than the 3300X looking at the 1% low data, then 6% faster with the RTX 2070 Super and only 5% faster with the Radeon RX 5700.
Finally, a look at the 1440p ultra data, probably the most realistic scenario for someone rocking a high-end graphics card like the RTX 2070 Super or RTX 2080 Ti. Here we see just a 5% difference between the 3600 and the 3300X. For perspective, in the games we just tested, the Ryzen 9 3900X's 1% low performance was only 11% better than the 3300X's.
Those margins shrink considerably with the RTX 2070 Super, where the 3600 was only 3% faster than the 3300X, while with the Radeon RX 5700 there's practically no difference at all.
What we have learned
Two months ago, we called the Ryzen 3 3100 and 3300X the new budget champions. Since then, Intel has released its 10th-gen Core chips, including the new Core i3 series, but these parts couldn't dethrone Ryzen 3. The only problem right now is that while you can buy the Core i3-10100, you're unlikely to find a 3300X anywhere.
The R3 3100 and 3300X have been out of stock for weeks, which is disappointing, and it's unclear when stock will return. Needless to say, there are stock shortages across a range of PC parts, so it's not just the 3300X that's affected by the global pandemic; it's everything.
For products whose stock levels weren't high heading into 2020, there's a good chance they're either overpriced or simply unavailable. The Ryzen 5 3600, for example, launched in mid-2019. Given the strong demand for that part, AMD had every incentive to pump them out as fast as possible and build a substantial inventory they can now work through.
As a result, the Ryzen 5 3600 is not only in stock today but often discounted, typically selling for around $175 versus its $200 MSRP. That makes the 3600 the cheapest Zen 2-based processor you can currently buy.
But what if the Ryzen 3 3300X were available? That might be the case by the time this article goes up, or a couple of weeks later… but if it were available for $130, should you buy it for gaming?
It's a quad-core after all, and those are useless for gaming, right? Well, as we explained last year in our feature "Are Quad Cores Dead in 2019?", it depends. We found that 4-core/4-thread processors struggle in a number of modern games and often suffer from poor frame-time performance, resulting in a noticeably worse gaming experience. However, 4-core/8-thread processors like the 3300X with SMT (Simultaneous Multithreading) are much better in this regard and generally avoid stuttery performance in modern titles. That's why processors like the Core i7-4790K, 6700K and 7700K still hold up quite well today, while their Core i5 counterparts don't.
Another important consideration is price. With the Ryzen 5 3600 down to just $175, choosing the 3300X only saves you $45 (about 25%). You give up 33% of the cores, but as we've seen in many cases, the impact on gaming performance is smaller than either number suggests. Still, if you're upgrading an old system and want to save as much money as possible, shaving $45 off the CPU can be a pretty big deal, especially if you won't notice a difference in performance.
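For what it's worth, the value math above works out as a quick back-of-the-envelope check, using the street prices quoted in this article:

```python
# Back-of-the-envelope value check using the prices quoted in the article.
r5_3600_price = 175    # discounted street price, USD
r3_3300x_price = 130   # 3300X at MSRP, USD

savings = r5_3600_price - r3_3300x_price        # dollars saved picking the 3300X
savings_pct = savings / r5_3600_price * 100     # roughly the 25% quoted above
core_cut_pct = (6 - 4) / 6 * 100                # cores given up: 6 -> 4

print(savings)                 # -> 45
print(round(savings_pct, 1))   # -> 25.7
print(round(core_cut_pct, 1))  # -> 33.3
```

The point of the comparison: both the ~25% price saving and the 33% core cut overstate the real gaming gap, which the GPU-limited results above put closer to 0-5% in most scenarios.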
Even compared to a class-leading gaming CPU like the Core i9-9900K (now succeeded by the Core i7-10700K), the Ryzen 3 3300X isn't much slower in many games. Outside of unrealistic conditions, namely an RTX 2080 Ti at 1080p with medium settings, where the 3300X was ~20% slower than the 9900K, that margin shrank to just 13% with ultra settings, and to 12% at 1440p, where it still averaged over 100 fps.
If you can afford to step up to the Ryzen 5 3600, we recommend you do so. The 3300X is a solid CPU and good value, but it sits right on the cusp of where 4-core/8-thread processors could become problematic down the road. It's far more obvious to us that the R5 3600 will be noticeably better than the 3300X over the next few years than it is that, say, the 3700X will pull ahead of the 3600 in the same way. We believe it will be a while before 6-core/12-thread processors stop cutting it for gamers, regardless of how many cores the new consoles pack.