Make no mistake, FidelityFX Super Resolution (or FSR for short) is AMD's direct competitor to Nvidia's DLSS, a feature that has become a major selling point for GeForce graphics cards over the past 18 months. It took AMD a long time to build its own upscaling feature, but as of today AMD is competitive and now needs to get FSR supported in more games.
This article will provide a comprehensive overview of FSR. We have a batch of quality comparisons at different resolutions, performance benchmarks across multiple GPUs, and comparisons with other upscaling technologies. There is a lot of ground to cover.
What is FSR?
FSR is an upscaling technology designed to improve game performance with minimal loss of visual quality. As with DLSS and checkerboard rendering, the concept is that you can reduce the game's rendering resolution and use an algorithm to upscale the image to your target resolution, usually the native resolution of your monitor. Reducing the rendering resolution greatly improves performance, and if the upscaling technique is good enough this can be achieved while preserving most of the image details.
The most popular form of this today is Nvidia's DLSS, the company's proprietary AI-based temporal upscaling solution that runs on GeForce RTX GPUs. Temporal upscaling means that data from several frames is collected and combined to form the final image, with the AI component running on Nvidia's tensor cores to support this reconstruction. DLSS has gone through multiple iterations: version 2.0 is a major improvement over the original release, and it has also gained decent game support after a lot of work from Nvidia.
FSR takes a different approach. Instead of using temporal upscaling, FSR relies exclusively on spatial upscaling. AMD tells us that AI is not used in any phase of the FSR process (so FSR is not the technology described in this patent floating around). This greatly simplifies the algorithm: spatial upscaling does not rely on data from multiple frames or motion vectors, which makes it easier to integrate into games as fewer inputs are required. However, since there is less data to work with, spatial upscaling algorithms have to be very good at figuring out how to reconstruct the image, and traditionally they have come up short here.
AMD hasn't gone into great detail about how its algorithm works, but they tell us it is not a simple iteration on the bilinear upscaling that is the "standard method" for spatial upscaling. AMD calls the technique an "Advanced Edge Reconstruction Algorithm", which is combined with a sharpening pass to create the final image. There is only one input to the algorithm: the lower-resolution frame.
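AMD's actual shader is far more sophisticated, but the general "upscale, then sharpen" structure can be illustrated with a toy example: plain bilinear upscaling followed by an unsharp-mask pass on a grayscale image. Everything below is a simplified sketch for illustration, not AMD's algorithm or code:

```python
def bilinear_upscale(img, scale):
    """Upscale a 2D grayscale image (list of lists) by `scale` with bilinear filtering."""
    h, w = len(img), len(img[0])
    out_h, out_w = int(h * scale), int(w * scale)
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map each output pixel back into source coordinates.
            sy = min(y / scale, h - 1)
            sx = min(x / scale, w - 1)
            y0, x0 = int(sy), int(sx)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = sy - y0, sx - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

def sharpen(img, amount=0.5):
    """Unsharp mask: push each interior pixel away from its 4-neighbour average."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            local_avg = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
            out[y][x] = img[y][x] + amount * (img[y][x] - local_avg)
    return out

low_res = [[0, 0, 255, 255]] * 4                  # a hard vertical edge
upscaled = sharpen(bilinear_upscale(low_res, 2.0))
print(len(upscaled), len(upscaled[0]))            # output is 8x8
```

Bilinear filtering alone smears the edge across several pixels; the sharpening pass partially restores its contrast, which is the basic trade-off any "upscale plus sharpen" scheme is making.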
Although it's a simpler technique, FSR still requires per-game integration. This is because FSR needs to run before the final effects phase in the pipeline, before the HUD is rendered and before things like film grain are applied. If FSR were applied to a game's final frame output, many elements (like the HUD) would be upscaled too, likely with artifacts and other visual issues. By integrating FSR into each game, developers can ensure that only the intended part of the pipeline runs at a reduced render resolution.
AMD chose the spatial upscaling approach for two reasons. The first is ease of integration: FSR is a single compute shader with simple inputs, released as open source. The second is that by dispensing with AI, AMD can offer broad support from the start, covering AMD GPUs of the Radeon RX 480 era (2016) and newer as well as Nvidia GPUs of the GeForce 10 series and newer. That means popular mainstream GPUs like the GTX 1060 and GTX 1660 can take advantage of FSR.
FSR vs. DLSS
Some people compare how FSR works to how DLSS works and might conclude that FSR is not really a competitor to DLSS because they work in radically different ways. But in my opinion that's not true. In practice, both technologies were designed and developed to achieve the same goal: to improve game performance by rendering at a lower resolution, but with minimal loss of visual quality from upscaling.
It's only fair to compare them head-to-head, and we'll do that later in this article.
FSR offers 4 quality modes: Ultra Quality, Quality, Balanced and Performance.
Each has its own scaling factor: 1.3x for Ultra Quality, 1.5x for Quality, 1.7x for Balanced and 2.0x for Performance. This means that with a target resolution of 4K, quality mode runs at 1440p and performance mode runs at 1080p, with the other modes being somewhere in between.
FSR supports any resolution (it is not limited to 4K and 1440p), and the scaling factors apply the same way at every resolution. FSR also works with dynamic resolution scaling and theoretically allows any scaling factor, but for now AMD keeps it simple with four fixed modes.
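The internal render resolutions follow directly from those per-axis scale factors. A quick sketch, using the mode names and factors AMD has published:

```python
# FSR quality modes and their per-axis scale factors, as published by AMD.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(target_w, target_h, mode):
    """Internal render resolution for a given target resolution and FSR mode."""
    factor = FSR_MODES[mode]
    return round(target_w / factor), round(target_h / factor)

for mode in FSR_MODES:
    print(mode, render_resolution(3840, 2160, mode))
```

For a 4K target this gives 2560x1440 in Quality mode and 1920x1080 in Performance mode, matching the figures above; note that the pixel count drops by the square of the factor, so Performance mode renders only a quarter of the native pixels.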
FSR launches today with support in 7 games, though by my count only 4 of them are fully released titles. The bigger names are Godfall and Anno 1800.
AMD has promised support for other titles soon, including Far Cry 6, Resident Evil Village, Dota 2, and Baldur's Gate 3 as some of the most recognizable names. Necromunda: Hired Gun will also be interesting, as that game already supports DLSS, so it could give us our first true DLSS vs FSR comparison.
But at the end of the day, that support list is just a promise. Nvidia did the same when it launched DLSS, yet many of those early titles never received DLSS support, and it took significant effort (and time) to bring DLSS 2.0 to a decent selection of titles.
Comparison of the image quality
For the quality comparison in the next section, we captured screenshots and video on a Radeon RX 6800 XT using a new driver that adds FSR support. Our test system paired a Ryzen 7 5800X with 16 GB of DDR4-3200 memory.
This kind of comparison works much better as video with live commentary. You can check out my Hardware Unboxed video below for full details, or read my conclusions below the video, followed by FSR performance benchmarks.
FSR holds up quite well when using the Ultra Quality or Quality modes at 4K. Without zooming in, these modes are similar to native rendering in Godfall, which is the kind of result you want. Switching to Quality introduces some softness, but it holds up well even with elements like foliage. The Balanced and Performance modes, however, are a clear trade-off in quality and, in the latter case, blurriness: not the image quality you would expect from native 4K.
I was impressed with FSR's ability to preserve fine details in the Ultra Quality and Quality modes. The image quality also holds up well in a game like Anno 1800, another title with a lot of fine detail in its native rendering; overall it stays nice and sharp.
If you care about image quality at all (which you probably do if you're playing at 4K), there is no way you can play in Performance mode. I would sooner turn down other image quality settings to improve frame rates than use FSR Performance, as it is basically a blur filter.
As for 1440p image quality, FSR isn't quite as good at this lower resolution, and there is a narrower range of quality modes I consider usable. There are only minor differences between native 1440p and 1440p with FSR in Ultra Quality mode, but when we switch to Quality mode the image becomes softer. As expected, image quality drops sharply in Balanced or Performance mode and is rather blurry, especially in Performance mode. FSR just isn't very good at upscaling a 720p image to 1440p while maintaining the clarity of native rendering.
FSR at 1080p isn't amazing either. Even in Ultra Quality mode there is a noticeable loss of detail compared to native rendering.
Benchmarks: FSR with Radeon GPUs
Now it's time to talk about performance. Given the limited time we had to play with FSR, we decided to test one game on eight GPUs at three resolutions and all four FSR modes.
We could have tested more than Godfall, but the other games AMD made available were much harder to benchmark consistently, and testing them would have left no time for a full visual analysis. Godfall was tested with epic settings, with no additional sharpening, no ray tracing, and no motion blur unless otherwise noted. We don't think Godfall's ray tracing implementation is worth enabling, so we didn't.
With the Radeon RX 6800 XT, the performance uplift at 4K is impressive. Switching from native to Ultra Quality FSR increased the average frame rate to 100 FPS, a 44 percent improvement. Quality mode is also usable at this resolution and delivered a 65 percent improvement over native. The lower modes improve performance further, but do not deliver 4K-like image quality.
At 1440p, with such a powerful GPU, we're more CPU-limited, especially once we lower the render resolution. Even so, switching from native 1440p to Ultra Quality FSR still delivered a 25 percent performance improvement with minimal impact on visual quality. Quality mode approaches a 40% increase, but I wouldn't recommend it at this resolution.
Interestingly, we can also see the overhead FSR adds, since 4K Quality mode renders internally at the same resolution as native 1440p. With this RDNA2 GPU we see an 8% reduction in frame rate compared to native rendering at that resolution, which is not nothing but is quite small for something that runs on the same shader units as the game itself.
The Radeon RX 5700 XT benefits in a similar way to the 6800 XT. At 4K, I achieved a 41% increase in ultra quality mode compared to native and an improvement of 66% in quality mode.
Quality mode basically makes the 5700 XT a 4K capable GPU in this game, as performance soars from a hard-to-play 35 FPS to almost 60 FPS. The 5700 XT also benefits more than the 6800 XT at 1440p, with a 29% increase from native rendering to FSR Ultra Quality.
Vega 64 is the next AMD GPU we tested, moving to a previous-generation architecture. At 4K we see a performance improvement of 39% in Ultra Quality mode and 60% in Quality. Vega 64 benefited less than the newer GPUs at 1440p, though, with only a 20% increase in Ultra Quality mode. This seems to be due to a higher performance overhead on the older architecture: 4K Quality mode ran 10% slower than native 1440p, compared to 8% on RDNA2.
With the RX 570 4GB, Godfall doesn't play well at all unless we use medium settings, as the game requires much more than 4GB of VRAM at higher settings.
FSR does not solve the VRAM restrictions to this extent. But with the medium settings, I saw a significant advantage in switching from 1440p native to 1440p Ultra Quality, a 47 percent increase that gets this GPU out of the unplayable zone at this resolution.
Benchmarks: FSR with GeForce GPUs
FidelityFX Super Resolution is also supported on Nvidia GPUs, so let's test that.
The RTX 3080 is more prone to CPU limitations as we lower the render resolution, and it's not as fast as the RX 6800 XT at lower resolutions to begin with. Because of this, we only saw a minimal 14% increase in performance at 1440p: we're basically rendering at 1080p here, with the GPU waiting on the CPU. At 4K, the performance uplift was more like the 6800 XT's, though still a bit smaller.
Ultra Quality mode increased the frame rate by 34%, compared to 44% on the 6800 XT, while Quality delivered a 53% increase versus 65% on the AMD GPU. Still, these numbers are nothing to sneeze at for 4K gamers after a performance boost with minimal deterioration in image quality.
The RTX 2070 plays very nicely with FSR. At 4K we saw performance increases of 46 and 74 percent in the Ultra Quality and Quality modes, respectively. These are some of the highest results yet, and it seems that the lower the native frame rate, the more FSR can increase it.
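That pattern makes sense if you think of frame time as a resolution-dependent part plus a roughly fixed part. A toy model makes the trend visible; all constants here are made-up illustrative values, not measured data:

```python
def fsr_speedup(native_fps, scale_factor, fixed_ms=2.0, fsr_cost_ms=0.5):
    """Fractional FPS gain from FSR under a toy frame-time model.

    Assumes part of the frame cost scales with pixel count (1/scale_factor^2),
    part is fixed (post-processing, CPU-side work), and the FSR pass itself
    adds a small cost. The default constants are hypothetical.
    """
    resolution_fraction = 1 / scale_factor ** 2   # pixel count relative to native
    native_ms = 1000 / native_fps
    scalable_ms = native_ms - fixed_ms            # the cost that shrinks with resolution
    fsr_ms = scalable_ms * resolution_fraction + fixed_ms + fsr_cost_ms
    return (1000 / fsr_ms) / native_fps - 1

# The same Quality-mode scale factor (1.5x) yields a bigger relative gain
# on a GPU that starts at a lower native frame rate:
print(round(fsr_speedup(70, 1.5), 2))  # faster GPU: smaller relative gain
print(round(fsr_speedup(35, 1.5), 2))  # slower GPU: larger relative gain
```

The intuition: a slower GPU spends a larger share of each frame on the resolution-dependent work that FSR shrinks, while the fixed per-frame costs stay constant, so its relative speedup is larger.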
At 1440p we saw a 30 percent increase in performance with FSR Ultra Quality, which is similar to the 5700 XT at this resolution. The performance overhead on this Nvidia GPU is the same as on AMD's RDNA2 GPUs, so it doesn't look like the feature was written in any way that degrades performance on Nvidia hardware.
With the GTX 1660 Super, you can expect huge increases in performance from FSR at certain resolutions because the card isn't as powerful as some of the others we tested. At 4K, I've seen a 59% increase in performance from 4K native to Ultra Quality and almost doubled the performance in Quality mode. Meanwhile, the 1660 Super also benefited from 33% higher performance at 1440p, going from native to ultra quality FSR.
With the Pascal-based GTX 1070 Ti, the performance story is very similar to the other GPUs tested: a performance increase of 52% and 82% for Ultra Quality and Quality at 4K, respectively, and a 29% increase at 1440p Ultra Quality.
This makes the 1070 Ti a much more playable GPU at both resolutions, and while I probably wouldn't use that card at 4K in this game, the gains at 1440p are very decent to improve the smoothness of gameplay.
The FSR State: What We Have Learned
After this first look at AMD's FidelityFX Super Resolution, it's time to share my thoughts. I have to say I'm surprised by what AMD has achieved with FSR. At the time of the announcement, when AMD explained how FSR would work, I expected it to be decent but not overly impressive, ultimately landing somewhere between DLSS 1.0 and 2.0 in quality. But the reality is that FSR is pretty good in the right circumstances and is sometimes competitive with DLSS 2.0.
Based on my tests so far, my recommendations are simple. At 4K, both Ultra Quality and Quality modes come pretty close to native rendering, while delivering performance increases of ~40% and ~65% respectively across a range of GPUs from AMD and Nvidia. I think most gamers will happily accept this kind of performance boost for the relatively small visual quality impact FSR has.
FSR can also be used at 1440p, but only in Ultra Quality mode. With this setting, image quality is pretty good and approaches native. Assuming you're not CPU-limited once the render resolution drops, I was usually able to get a ~30% performance gain in Godfall.
The other modes are not particularly useful. Regardless of resolution, the Balanced and Performance modes suffer from a noticeable loss of quality and introduce artifacts such as shimmering. Even at 1080p, FSR doesn't really perform well, even with Ultra Quality mode enabled. FSR is still better than traditional upscaling, though, so these lower modes may come in handy on integrated graphics, where anything is better than nothing.
It's more complicated when comparing FSR and DLSS 2.0. At its best, FSR is pretty competitive with DLSS 2.0, although we still can't compare the two in the same game. Based on my extensive testing and retesting of both techniques, the image quality FSR provides in its higher quality modes is only very slightly behind DLSS while providing a similar performance increase. FSR also does not suffer from ghosting in motion, since it is not a temporal solution, and its performance overhead for a given render resolution appears to be lower than DLSS's.
However, DLSS 2.0 is significantly better at upscaling lower render resolutions, e.g. when transforming 1080p into 4K. A spatial upscaler is just not as good as a temporal upscaler, which can gather information from more sources when reconstructing the image. In practice this means DLSS 2.0 is much better in the Balanced and Performance modes: where FSR gets fuzzy, DLSS retains more detail. Both techniques can suffer from shimmering or similar artifacts depending on the situation.
While DLSS 2.0 can be technically superior in some situations, AMD's counterpunch is broader support and ease of integration. FSR works on low-end products, even those from Nvidia, which is pretty neat. A $230 (in normal times) GPU like the GTX 1660 Super isn't fast enough to run Godfall at 1440p with epic settings, but with a 33% performance boost from FSR in Ultra Quality mode, that resolution becomes realistic without turning down all the other settings. Nvidia is not serving this customer with DLSS, which puts us in the weird situation where AMD is supporting Nvidia's mainstream GPU buyers better than Nvidia is.
It will also spark all sorts of discussions among developers. With limited resources, do they integrate FSR, which can reach a wider audience but isn't quite as good as DLSS 2.0, or do they go for the higher-quality option in DLSS, which is limited to RTX GPU owners? Including both is obviously the preferred option, but that may not be feasible for most studios, and even some large developers may take the easy route here. There is a lot to play out over the next few years, and this seems to be AMD's strategy.
It's nice to have more GPU support, but it means very little if FSR isn't implemented in more games. Nvidia had a multi-year head start to develop its DLSS ecosystem and that is paying off now that DLSS 2.0 is supported in many major titles.
It's all well and good to show 7 games at launch, but honestly, I'm not thrilled with the list of FSR games you can play right now. DLSS 2.0, on the other hand, is supported in Fortnite, Call of Duty: Warzone, Cyberpunk 2077, Metro Exodus Enhanced Edition, Death Stranding, and more. These are big games, including titles that remain popular to this day.
Of course, AMD has to start somewhere, and launching with several games, a list of promised titles, and decent technology is better than I expected. Not every game has DLSS support either, but the point is that FSR is not currently a selling point for AMD, just as DLSS was not a selling point for Nvidia when it launched. It will take some time for FSR to be supported in a decent number of games, while DLSS is miles ahead, and that might push you toward Nvidia's GPUs today.