FreeSync 2 is AMD's monitor technology for the next generation of HDR gaming displays. In an earlier article, we explained everything you need to know about it. Now we give you an impression of how you can actually use one of these monitors for some games and whether it is worth buying a FreeSync 2 monitor now.
The monitor I've tested FreeSync 2 with is the Samsung C49HG90, an absurdly wide 49-inch dual-1080p display with a total resolution of 3840 x 1080. It has an 1800R curve, uses VA panel technology, and is DisplayHDR 600 certified. This means it delivers up to 600 nits of peak brightness, covers at least 90% of the DCI-P3 color gamut, and has basic local dimming.
While this panel does not meet the full DisplayHDR 1000 tier, with 1000 nits of peak brightness for optimal HDR, the Samsung CHG90 offers more than just an entry-level HDR experience.
There are many supposedly HDR-capable panels that cannot push their brightness beyond 400 nits and do not support a color gamut wider than sRGB. The latest Quantum Dot monitors from Samsung, however, offer higher brightness and a wider gamut than conventional SDR displays.
Although FreeSync 2 is advertised on the Samsung CHG90's product page, the monitor does not support it out of the box. A firmware update must first be downloaded and installed, which is not a great experience. As explained earlier, AMD announced FreeSync 2 in early 2017, yet this is the first generation of products that actually supports the technology.
There will likely be many cases where users buy this monitor, connect it to their PC without performing a firmware update, and simply assume that FreeSync 2 works as intended. The fact that you may need to update the firmware isn't well advertised on the Samsung website – it's hidden in a footnote at the bottom of the page – and updating a monitor's firmware isn't exactly a common task.
If you buy a supported Samsung Quantum Dot monitor, make sure it is running the latest firmware that introduces FreeSync 2 support. If the correct firmware is installed, a FreeSync 2 logo appears on the Information tab of the on-screen display.
AMD's graphics driver and software utility exacerbate this firmware problem. While Radeon Settings shows when your GPU is connected to a FreeSync display, it makes no distinction between FreeSync and FreeSync 2. Neither Radeon Settings nor any part of Windows tells you that your system is connected to a FreeSync 2 display, so there is no way to check whether FreeSync 2 is working, whether your monitor supports it, or whether it is enabled.
The way Radeon Settings reported FreeSync support did not change after we updated our monitor to the FreeSync 2 firmware. This will confuse users and requires AMD's attention.
How does FreeSync 2 actually work and how do you set it up?
Assuming FreeSync is enabled in Radeon Settings and in the monitor's on-screen display – and both are enabled by default – it should be operational. There is no magic switch to get everything going, and no real configuration options. Instead, the main features are either permanently active, such as low latency and low framerate compensation, or come into play when HDR is used.
To use FreeSync 2's HDR features, you must first enable HDR. For Windows 10 desktop applications, this means opening the display settings in the Settings menu and enabling "HDR and WCG". This switches the Windows desktop environment to HDR, and any app that supports HDR can pass its HDR data directly to the monitor via HDR10. For standard SDR apps, which currently make up the majority of Windows apps, Windows 10 attempts to map the SDR colors and brightness into HDR, because the display mode cannot be switched automatically on the fly.
Although Windows 10 has improved its HDR support with every major update, it has not yet reached the point where SDR is mapped correctly to HDR. With HDR and WCG enabled, SDR apps look washed out and lack brightness. Some apps, like Chrome, are outright broken in HDR mode. Windows has a slider to change the base brightness for SDR content, but on our Samsung test monitor the maximum supported SDR brightness in this mode is around 190 nits, significantly below the monitor's 350-nit maximum when HDR and WCG are in use.
Now, 190 nits of brightness is probably fine for many users, but it's a little strange that the slider doesn't cover the monitor's full brightness range. It is also a separate control from the monitor's own brightness setting: if the monitor brightness is set below 100, you get less than 190 nits when viewing SDR content.
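To put those nit figures in perspective: HDR10 signals encode absolute luminance using the SMPTE ST 2084 perceptual quantizer (PQ) curve, so a given brightness level always corresponds to the same signal value regardless of the display. A minimal sketch of the PQ encoding, using the constants defined in the ST 2084 specification, shows where the levels discussed above sit on that curve:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance (nits) -> signal [0, 1].
# The constants below are defined in the ST 2084 specification.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits: float) -> float:
    """Map an absolute luminance in nits to a PQ signal value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0   # PQ is referenced to a 10,000-nit peak
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

# The brightness levels discussed in this article, as PQ signal values:
for nits in (190, 350, 600, 1000):
    print(f"{nits:5d} nits -> PQ {pq_encode(nits):.3f}")
```

Because the curve is perceptual, the jump from 190 to 350 nits uses up far less of the signal range than the raw numbers suggest – which is also why a display's true peak brightness matters so much for HDR highlights.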
If all of this seems confusing to you, it's because it is. In fact, the overall Windows desktop HDR implementation is a bit messy, and if you can believe it, previous versions of Windows 10 were even worse.
This is not specific to FreeSync 2 monitors; it applies to any HDR display connected to a Windows 10 PC. For now, we recommend disabling HDR and WCG when using the Windows 10 desktop and only enabling them when you want to run an HDR app, as this gives the best SDR experience in the vast majority of apps, which currently don't support HDR.
So what about games? That's surely the area where FreeSync 2 monitors and HDR really shine, right? It depends. We tried a number of games that currently support HDR on Windows and were largely disappointed with the results. HDR implementations differ from game to game, and it seems many developers currently have no idea how to properly tone map their games for HDR.
The worst offenders are EA's games. Mass Effect Andromeda's HDR implementation was already known to be an embarrassment when HDR monitors were first shown, and the curse of bad HDR continues in Battlefield 1 and the newer Star Wars Battlefront 2. Both games have washed-out colors in HDR mode that look far worse than the SDR presentation, combined with a generally dark cast to the image and poor use of HDR's spectacular bright highlights. In all three of EA's recent HDR-capable games, there's no reason to enable HDR because SDR looks so much better.
We're not sure why these games look so bad; reports suggest EA titles also look poor on TVs with better HDR support and on consoles. We suspect the way EA's Frostbite engine handles HDR is fundamentally broken, and hopefully it can be fixed for upcoming games.
Hitman is one of the older games that supports HDR, and it doesn't handle HDR well either. Although the presentation isn't as washed out as in the EA titles, the colors are still muted and the image is generally too dark, with few (if any) impressive highlights. The point of HDR is to expand the color gamut and increase the brightness range, but in Hitman everything just seems darker and less intense. This is another game you should play in SDR mode.
Assassin's Creed Origins has an interesting HDR implementation that lets you adjust the brightness ranges to the exact specifications of your display. Even so, we're not sure whether the game looks better in HDR or SDR mode. HDR appears to have better highlights and a wider range of colors during the day, but suffers from a strange lack of depth at night, making night scenes feel oddly unlike night. SDR looks better during those night periods and only slightly trails the HDR presentation during the day.
Assassin's Creed Origins would look better on a display with full-array local dimming, which this Samsung monitor lacks, but it's still not the best HDR implementation we've seen.
Far Cry 5 is by far the best HDR game we tested. AMD tells us it will be the first game to support FreeSync 2's GPU-side tone mapping in the coming weeks, although it doesn't currently use that HDR pipeline. Instead, as with most HDR games, Far Cry 5 outputs HDR10, which is passed to the display for tone mapping.
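To illustrate what tone mapping involves – whether the display does it, or the GPU does it as in FreeSync 2's pipeline – it compresses the scene's luminance range into what the panel can actually show. Here is a minimal sketch using the classic extended Reinhard operator; this is an illustrative stand-in, not the operator FreeSync 2 or any particular game actually uses:

```python
def reinhard_extended(l_in: float, l_white: float) -> float:
    """Extended Reinhard tone mapping: compresses scene luminance so that
    l_white (the brightest luminance to preserve detail in) maps to 1.0,
    while darker values are left mostly intact."""
    return l_in * (1 + l_in / (l_white ** 2)) / (1 + l_in)

# Map relative scene luminances into the display's [0, 1] range, treating
# 4.0 as the brightest highlight we want to retain detail in. Midtones are
# barely touched; bright highlights are compressed toward the top.
for l in (0.25, 1.0, 2.0, 4.0):
    print(f"scene {l:4.2f} -> display {reinhard_extended(l, 4.0):.3f}")
```

The practical advantage of doing this on the GPU, as FreeSync 2 proposes, is that the game can tone map directly to the monitor's known brightness and gamut limits in a single step, rather than outputting generic HDR10 and letting the display guess with a second round of processing.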
Unlike most other games, however, Far Cry 5's HDR10 output is quite good. The color gamut is expanded significantly for more vivid colors, and we don't get the washed-out look of many other HDR titles. Bright highlights are genuinely brighter in HDR mode, with a wide dynamic range, and in general this is one of the few titles that looks much better in HDR. Good job, Ubisoft.
Middle-earth: Shadow of War is another game with a decent HDR implementation. With HDR enabled, the game uses a much wider range of colors and the highlights are brighter. Again, there is no problem with dull colors or a washed-out presentation, which means the HDR mode improves on the SDR presentation in practically every way.
How you activate HDR in these games is not always the same. Most titles have a built-in HDR toggle that overrides Windows' HDR and WCG setting, so you can leave the desktop in SDR and only enable HDR in the games you want to play in HDR mode.
Hitman is an interesting case: there is an HDR toggle in the game settings, but it displays a black screen if the Windows HDR setting is also enabled. Shadow of War has no HDR toggle at all and instead follows the Windows HDR and WCG setting. This is annoying because you have to manually switch the Windows setting back and forth to get the best HDR experience in-game and a decent SDR experience on the desktop.
While the HDR experience in many games is currently pretty bad – often worse than the basic SDR presentation – we believe there is reason to be optimistic about the future of HDR gaming on PC. Newer games like Far Cry 5 and Shadow of War have fairly decent HDR implementations that noticeably improve on the SDR mode, while many of the games with poor HDR implementations are somewhat older.
As the HDR ecosystem matures, we should see more Far Cry 5s and fewer Mass Effect Andromedas in terms of HDR implementation.
We're also not yet at the stage where games use FreeSync 2's GPU-side tone mapping. As mentioned earlier, Far Cry 5 will be the first to do so in the coming weeks, and AMD says other games due later this year will include FreeSync 2 support at launch.
It will be interesting to see how GPU-side tone mapping evolves, but it definitely has room to improve HDR implementation for PC gaming.
From today's perspective, however, we see little reason to buy a FreeSync 2 monitor until more games ship with decent HDR. The experience is simply too hit-and-miss to justify the significant investment in a first-generation FreeSync 2 HDR monitor. This isn't the kind of technology worth adopting early right now: a larger selection of HDR monitors should arrive later in the year, possibly with better HDR support through higher brightness, full-array local dimming and wider color gamuts. By then the HDR game ecosystem will also be worth another look, hopefully with more games featuring decent HDR implementations.
That doesn't mean you should avoid these Samsung Quantum Dot FreeSync 2 monitors; in fact, they're pretty good as gaming monitors. Just don't buy them specifically for their HDR capabilities, or you may end up a little disappointed for now.