GeForce GT 1030: The DDR4 Abomination Benchmarked

Today we're going to look at one of the worst graphics cards ever released. But before we get to that … imagine you're on a tight budget. Some of you probably don't need to imagine too hard; after all, PC gaming is something most of us do to unwind after a day's work, to relax a bit and have fun.

Not everyone can afford to pour hundreds or thousands of dollars into PC gaming when there are other bills to pay. Or you might be a kid just getting into PC games and computers. That can be tough: most parents will be more sold on the idea of buying a console than a computer, which I think is a mistake, but we won't get into that here.

The point is that none of these people have endless money to spend on upgrading or building a new PC, so they go looking for an affordable graphics card. Most of the time they're just starting their PC gaming journey, so they're not that experienced when it comes to buying one.

A little light research will show you that sub-$100 graphics cards like the GeForce GT 1030 can deliver over 60 fps in Fortnite at 1080p using competitive quality settings. That will be enough for most, and although something like the GTX 1050 only costs about $30 more, the GT 1030 does exactly what you need, so why spend the extra money? Of course, we could come up with a few good reasons, but I understand why so many first-time buyers pick up these entry-level graphics cards.

So having seen the benchmarks, you know a GeForce GT 1030 will get your system to at least 60 fps, and you buy one for the typical price of $90. What you don't know is that not all GT 1030 graphics cards are the same, not even close. You could buy a GT 1030 that actually pushes over 60 fps in Fortnite at 1080p, or you could get one that barely keeps the frame rate above 30 fps.

The problem is that both carry exactly the same name. You would assume the 50% slower version would be called the GT 1030 LE or 1030 SE, or better still, given an entirely different name such as GT 1020. Instead, Nvidia has chosen to mislead customers into believing that all GT 1030 graphics cards are the same.

So what exactly is going on here? Well, over a year ago, in May 2017, Nvidia released the GT 1030, the cheapest GPU of the GeForce 10 series with an MSRP of just $70. It packed 384 CUDA cores clocked at 1.28 GHz with a 1.47 GHz boost frequency, and its GDDR5 memory ran at 1.5 GHz on a narrow 64-bit bus, providing 48 GB/s of bandwidth.

The GT 1030 didn't blow anyone's socks off, but it was what it was: an extremely affordable current-generation GPU.

As of March 2018, however, budget buyers could no longer purchase a GT 1030 and know exactly what they were getting, at least not without checking the type of memory used. That's because Nvidia quietly introduced a DDR4 version, using the same type of memory modern desktop PCs use as system memory. This is a big problem, because DDR4 is significantly slower than GDDR5.

For this transition, the base memory frequency was reduced by 30%, and as bad as that sounds, the end result is far worse. GDDR5 memory features two parallel data links, offering twice the I/O throughput of DDR4 at the same clock. This means the effective memory frequency has actually been reduced by 65%, and since the same 64-bit memory bus is still being used, memory bandwidth has also been cut by 65% from an already anemic 48 GB/s.

The end result is a DDR4 version of the GT 1030 with just a 16.8 GB/s connection to its 2 GB memory buffer. That's only slightly more bandwidth than the miserable DDR3 version of the GeForce GT 730 released back in 2014. For some reason, Nvidia also dialed back the core frequency, dropping the DDR4 version to 1.15 GHz, a reduction of 6%.
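To make the memory math concrete, here's a minimal sketch of how those bandwidth figures fall out of the clocks and the 64-bit bus. The effective transfer rates used below (6,000 MT/s for GDDR5 at its 1.5 GHz base clock, 2,100 MT/s for DDR4 at the 30%-lower base clock) are assumptions derived from the figures above rather than Nvidia-quoted numbers.

```python
# Rough sketch of the GT 1030 memory bandwidth math (effective rates are assumptions
# derived from the clocks quoted in the article, not official spec-sheet values).

def bandwidth_gbs(effective_mts, bus_width_bits):
    """Bandwidth in GB/s = effective transfer rate (MT/s) x bus width in bytes."""
    return effective_mts * 1e6 * (bus_width_bits / 8) / 1e9

# GDDR5 version: 1.5 GHz base clock, quad data rate -> ~6,000 MT/s effective
gddr5 = bandwidth_gbs(6000, 64)   # ~48 GB/s

# DDR4 version: ~1.05 GHz base clock (30% lower), double data rate -> ~2,100 MT/s effective
ddr4 = bandwidth_gbs(2100, 64)    # ~16.8 GB/s

print(f"GDDR5: {gddr5:.1f} GB/s, DDR4: {ddr4:.1f} GB/s")
print(f"Bandwidth reduction: {(1 - ddr4 / gddr5) * 100:.0f}%")  # ~65%
```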

So the DDR4 version has far less memory bandwidth but the same name, and looking around online, it seems to sell for about the same price. How concerned should you be as a potential GT 1030 buyer? Nvidia didn't see the need to change the name, so can it really be that different? Well, we're about to find out.

All graphics cards were benchmarked on our Core i3-8100 test system with 8 GB of DDR4-2666 memory. For comparison, we've also included the Ryzen 5 2400G APU with its integrated Vega 11 graphics, likewise paired with 8 GB of DDR4-2666. Let's get to the results…

Benchmarks

To test Battlefield 1, we went with the low quality preset. We normally test low-end graphics cards and even APUs using medium quality settings, but at 720p the DDR4 version of the GT 1030 was barely able to deliver playable performance even here. The low quality preset allowed an average of 51 fps, and that doesn't sound too bad until you learn that the GDDR5 version is 104% faster.
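For anyone wondering how these margins are worked out, "X% faster" is simply the relative increase in average frame rate. A quick sketch using the Battlefield 1 numbers (the GDDR5 average of roughly 104 fps is inferred from the stated margin, not a directly quoted result):

```python
# "X% faster" = relative increase of one average frame rate over the other.
ddr4_avg = 51     # fps, the Battlefield 1 720p low result quoted above
gddr5_avg = 104   # fps, assumed here from the ~104% margin, not a quoted figure

margin = (gddr5_avg / ddr4_avg - 1) * 100
print(f"GDDR5 version is {margin:.0f}% faster")  # ~104%
```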

Switching to 1080p, the DDR4 version of the GT 1030 can't even reach 30 fps with the lowest possible quality settings. The Ryzen 5 2400G APU was 64% faster, and worse still, the GDDR5 version, the original GT 1030, is 118% faster. 118%. I'm not even sure what to say.

Using the lowest possible quality settings in Prey at 720p, the DDR4 GT 1030 managed to average a little over 60 fps, which again meant the GDDR5 model was roughly twice as fast.

Increasing the resolution to 1080p, the GDDR5 model is now more than twice as fast as the crippled DDR4 version. Once again, the DDR4 version can't even average 30 fps with the lowest possible quality settings.

Far Cry 5 is a recent title and pretty well optimized. Still, if you bought the DDR4 version of the GT 1030, you're in for a bad time. Even at 720p with the lowest possible quality settings, it couldn't average 30 fps, while the GDDR5 model was almost 80% faster, which is actually the smallest margin we've seen so far.

Here are the 1080p results, and what can you really say? The GDDR5 version of the GT 1030 is bad enough here, but the DDR4 model is something else entirely: pure garbage is what it is.

Although DiRT 4 was playable on the DDR4 card at 720p using the lowest possible quality settings, it was still far off what the GDDR5 version can do.

At 1080p, the DDR4 version drops well below 60 fps, while the GDDR5 model kept the frame rate above 80 fps at all times.

I suspect a lot of GT 1030 graphics cards are currently being bought by those looking to get in on the Fortnite action, and if so, anyone caught out by the DDR4 version is going to be livid. Results like these show that using the medium quality settings at 720p, the original GT 1030 stays above 100 fps at all times. The DDR4 version struggles to keep its dips above 60 fps, though I suppose it's still playable, so there's that.

Of course, most people don't want to play at 720p if they can avoid it, and with the DDR4 version they can't. At 1080p we see an average of just 38 fps, while the original GT 1030 spat out an average of 66 fps.

Moving on to Rocket League, we see an average of just 60 fps at 720p using the "Quality" preset. Here the GDDR5 model was almost 90% faster with an average of 114 fps.

At 1080p, the DDR4 GT 1030 scrapes by with an average of 32 fps, and at this more memory-intensive resolution the GDDR5 model was 122% faster, which is just ridiculous.

The last game we're going to look at is Rainbow Six Siege. Using the lowest possible quality options at 720p, the DDR4 GT 1030 is good for 73 fps, which isn't bad, apart from the fact that the GDDR5 model is almost 80% faster at 130 fps.

At 1080p, the bandwidth-starved DDR4 model is again severely limited; here we saw an average of 39 fps, making the GDDR5 model 64% faster.

Not that it matters in the slightest, but here are the power consumption figures. Because the memory cripples the GPU, the DDR4 version does use less power, reducing total system consumption by 16%. Keeping in mind that we've often seen a 50% reduction in performance, that makes the DDR4 version significantly worse in terms of performance per watt.
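As a rough sketch of why that's the case, here's the performance-per-watt arithmetic, using the ~50% performance deficit and 16% lower total system power mentioned above (treating total system power as a stand-in for card power is a simplification):

```python
# Relative performance per watt of the DDR4 card versus the GDDR5 card,
# using the article's figures as rough inputs.
relative_perf = 0.50            # ~50% of the GDDR5 card's performance
relative_power = 1.0 - 0.16     # 16% lower total system power draw

perf_per_watt_ratio = relative_perf / relative_power
print(f"DDR4 perf/watt is ~{perf_per_watt_ratio:.0%} of the GDDR5 card's")  # ~60%
```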

Conclusion

Wow. Where do you even start with a product like this? I'm still coming to terms with the fact that it exists. How is this even a thing?

Actually, I should be clear about this: as silly as the DDR4 GT 1030 is, I don't have a major problem with its existence. The real problem is that Nvidia had the gall to call it a GT 1030 and not a GT 1020. I don't know how they expect to get away with it.

It's often 50% slower, yet carries exactly the same name and price. Again, how is that acceptable?

It's not like we went out of our way to degrade performance or test under unusual conditions to make it look worse than it really is. Almost every game was tested at 720p and 1080p using the lowest possible quality settings. The GDDR5 model can actually handle higher quality visuals at 1080p, which would cripple the DDR4 version even further, so if anything we're showing a best-case scenario.

I can only imagine how upset I'd be after saving up to upgrade my graphics card, only to end up with this. I couldn't imagine buying another GeForce product after that. I'd also be pretty annoyed with the retailer that sold it to me, although I realize they're not really the problem here.

That actually prompted me to ask a few local retailers how they view the DDR4 version of the GT 1030, and whether they ever see any kind of backlash from this sort of thing.

For obvious reasons I'm not going to name any of the retailers, but I can tell you what they said, and they all said pretty much the same thing. First, the big problem for retailers is that their purchasing staff often don't know exactly which models they're buying, and I know that sounds silly, but it makes sense. The people in these roles are often not geeks; they pay attention to product codes and prices, not what type of memory a graphics card uses. They'll see that GT 1030 inventory is low and simply order in more GT 1030s. It's not as if each brand offered just a single GT 1030 before the DDR4 mess, either; most brands offered 4 or 5 models.

So many of them unwittingly bought DDR4 GT 1030 stock, assuming it was exactly the same as the GDDR5 GT 1030 stock they had been buying over the past year. Often the specifications are supplied to them and simply copied and pasted onto the website; this isn't uncommon.

Almost all of them admitted that without the controversy surrounding the spec change they would have been none the wiser, and it would have been pure chance if they had noticed the change at all. They also said that when the model name doesn't change and there's no obvious difference, this kind of thing often slips under the radar.

This then creates real problems for these retailers, as they end up unwittingly selling customers an inferior product, and for obvious reasons that doesn't go down well. So Nvidia isn't just hurting its customers; it's potentially hurting retailers and its own board partners as well. It's a bad look for everyone involved and needs to be fixed soon.
