How to bypass the HDMI 2.1 issue using an AMD GPU in Bazzite (Linux)

Due to a legal dispute between AMD and the HDMI Forum, HDMI 2.1 support on Linux with an AMD GPU is currently missing. Basically, since AMD's drivers are open source (which is the reason AMD GPUs work perfectly on Linux while Nvidia ones are a pain in the a$$), the HDMI Forum has rejected the open-source implementation of HDMI 2.1 because it wants its specification to stay closed. The result is that, even if we use a GPU with an HDMI 2.1 port together with a monitor/TV and a cable that both support HDMI 2.1 in hardware, software-wise we are locked down to HDMI 2.0. This means we have only 18 Gbps of bandwidth at our disposal instead of 48 Gbps.

So, generally speaking, HDMI on AMD GPUs in Linux works, but with strong limitations. Practically speaking, it means we can’t have all the video features turned on at max settings at the same time (4K resolution, VRR, full 12-bit HDR, 120 Hz+ refresh rate, 4:4:4 chroma); we have to either lower the resolution, disable VRR, disable HDR, or let the TV/monitor automatically fall back to subsampling (reducing HDR to 8-bit and chroma to 4:2:0, for instance). So it’s not true that we can’t get 4K @ 120 Hz on Linux using AMD, as many seem to believe from reading comments online, but the color information will be noticeably reduced to fit the limited bandwidth. For instance, out of the box I’m getting 4K @ 120 Hz + VRR + limited HDR (8-bit) + limited chroma (4:2:0) by selecting 3840×2160 (120 Hz) in the display settings on my LG C4. This might be a huge issue or no issue at all, depending on your monitor/TV and your personal perception!
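The bandwidth numbers above can be sanity-checked with some back-of-envelope math. This sketch uses the standard CTA-861 timing totals (which include blanking) and counts only the raw pixel payload, ignoring link-encoding overhead, so the figures are approximations rather than exact spec values:

```python
# Rough HDMI bandwidth check (raw pixel payload only; real links add
# encoding overhead, so treat these figures as approximations).
# Timings are the standard CTA-861 totals, which include blanking.

HDMI_2_0_GBPS = 18.0   # the limit we are stuck with on Linux + AMD
HDMI_2_1_GBPS = 48.0   # the limit the hardware could reach

def raw_gbps(h_total, v_total, refresh_hz, bits_per_channel, chroma):
    """Approximate raw data rate in Gbit/s for a video mode."""
    # 4:4:4 carries 3 full channels per pixel; 4:2:0 averages 1.5.
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * bits_per_channel * channels / 1e9

# 4K @ 120 Hz (CTA total 4400x2250), full 12-bit 4:4:4
uhd_full = raw_gbps(4400, 2250, 120, 12, "4:4:4")
# 4K @ 120 Hz after the TV's fallback to 8-bit 4:2:0
uhd_sub = raw_gbps(4400, 2250, 120, 8, "4:2:0")
# 1080p @ 120 Hz (CTA total 2200x1125), full 12-bit 4:4:4
fhd_full = raw_gbps(2200, 1125, 120, 12, "4:4:4")

print(f"4K120 12-bit 4:4:4    ~ {uhd_full:.1f} Gbps")  # needs HDMI 2.1
print(f"4K120  8-bit 4:2:0    ~ {uhd_sub:.1f} Gbps")   # fits HDMI 2.0
print(f"1080p120 12-bit 4:4:4 ~ {fhd_full:.1f} Gbps")  # fits HDMI 2.0
```

This is why the TV falls back to 8-bit 4:2:0 at 4K @ 120 Hz (about 14 Gbps, under the 18 Gbps ceiling), while the full 12-bit 4:4:4 signal (about 43 Gbps) would need the HDMI 2.1 link we can't use, and also why 1080p @ 120 Hz (about 11 Gbps) sails through with everything enabled.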

The only resolution that allows full HDR (12-bit) + VRR with AMD FreeSync + 120 Hz refresh rate + full chroma (4:4:4), at least on my TV (LG C4), is 1920×1080 (aka Full HD / 1080p). Of course, playing at 1080p with top-tier hardware capable of averaging 70 FPS @ 4K is not an option for me, but what if I told you we can get an output quality close to native 4K even with a display resolution of 1920×1080? It sounds crazy, but this is an AMD feature called Virtual Super Resolution (VSR) which, as AMD itself writes, delivers quality that rivals up to 4K, even on a 1080p display, while playing your favorite games. On Windows this feature is implemented directly in the drivers; on Linux it is not, but on Bazzite we can get the same result by doing the following:

  1. Configure the display setting as 1920×1080 @ 120 Hz in Gamescope UI (the Bazzite UI). This will “make room” to pass the full color profile (12-bit HDR and 4:4:4 chroma) over HDMI.
  2. Ensure that both “automatically scale Image” and “automatically scale UI” are turned off in Gamescope UI.
  3. Set the max game resolution to 3840×2160 (4K) in Gamescope UI.
  4. Set the in-game resolution to 3840×2160 (4K). This makes the GPU render the most detailed source image possible.
  5. Set the game window mode to “full screen”.
  6. Configure the Bazzite scaling filter in the “quick access menu” (the one that opens on the right, triggered by “Xbox button” + “A” if you are using an Xbox controller) to use the sharp algorithm, to get the cleanest and most detailed image possible when rendered on the TV. Here we can use the slider to set a value from 0 to 5: the higher the value, the more detailed the resulting image, but also the more noise and “artifacts” produced, especially on certain textures like clothing fabric (finding an NPC wearing a sweater, jeans or something else that is not a flat, uniform fabric is very helpful for calibrating the value ^_^). The best trade-off is usually 2 or 3 if the game has good HDR and anti-aliasing implementations. In the worst case, 0 should avoid visual glitches even in badly behaved games, while 5 should generally always be avoided because it produces too much “noise” in the output.
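For reference, when running gamescope by hand (e.g. as Steam launch options in desktop mode) rather than through the Bazzite UI, roughly the same setup can be sketched with gamescope's own flags. This is an approximation, not the exact Bazzite configuration: flag availability differs between gamescope versions, and gamescope's `--sharpness` scale (0–20, where lower is sharper) does not map one-to-one onto Bazzite's 0–5 slider, so verify everything against `gamescope --help` first:

```shell
# Sketch of steps 1-6 as Steam launch options (assumptions: your
# gamescope build supports these flags; check `gamescope --help`).
# -W/-H: output (display) resolution; -w/-h: game render resolution,
# so the 4K render gets downscaled to the 1080p output (the VSR idea).
gamescope -W 1920 -H 1080 -w 3840 -h 2160 -r 120 \
          --hdr-enabled --adaptive-sync \
          --sharpness 10 \
          -f -- %command%
```

In Bazzite's Game Mode you should not need any of this, since the session already runs inside gamescope and the UI settings from the steps above drive the same machinery.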

With deep color information and properly configured anti-aliasing, the final result is something unbelievable. It’s not the same as native 4K, of course, but it’s close enough, and the colors and image details are way better than setting the display resolution to 3840×2160 and triggering the subsampling on the TV.
