Display Tech – The future looks good

BENNETT RING can be overclocked to 165Hz and is HDR ready

As the primary interface between the player and the game worlds we immerse ourselves in, your gaming display is one of the most important components in your entire setup. Graphics make up a huge part of the gaming experience, and showing them off in the best possible way relies upon a high-quality display. What’s the point in blowing $750 on a beastly graphics card if you’re going to pump its polygons through a panel that doesn’t truly represent the colors the developers intended, or causes horrible motion blur during high-speed scenes? Displays are continuing to evolve at a breakneck pace, and over the last year we’ve seen lots of new technologies and terms bandied about. But which ones are truly important, and what do they actually mean? We’re going to explain some of the most important new display technologies that have recently hit the market, or will do so in the near future, and hopefully deliver a clearer picture of what these terms mean when it comes to viewing your games.

AMD’s explanation of how HDR mimics the human eye much more accurately than older display technologies.

HIGH DYNAMIC RANGE DISPLAYS

Otherwise known as HDR, this is arguably one of the most important improvements in display technology in the last decade, and will certainly be the hottest display technology in the coming year. It’s being driven by the TV industry first and foremost, where the improved color quality brings movies and TV shows to life in a far more noticeable way than a simple increase in resolution. The term itself actually covers a bunch of improvements in the way that today’s displays show off color, contrast and brightness.

The first improvement HDR brings is an increase in color information. Today’s displays are capable of 6- or 8-bit color, with the latter able to deliver 16.8 million different colors. HDR ups this dramatically. The Ultra HD Premium standard increases it to 10 bits per channel, while the competing Dolby Vision goes one better, using 12 bits to store color information. That 10-bit signal (the basis of the HDR10 format) makes these displays theoretically capable of displaying just over one billion different colors, while Dolby Vision’s 12 bits increase this to a whopping 68 billion different colors.
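If you’re curious where those figures come from, here’s a quick back-of-the-envelope sketch in Python. It’s purely illustrative arithmetic, not something drawn from any HDR specification: a panel with n bits per channel can show 2^n shades of each of red, green and blue, and the total number of colors is the product of the three.

```python
# Illustrative arithmetic only: each extra bit per channel doubles the
# number of shades of red, green and blue a panel can address.
def total_colors(bits_per_channel: int) -> int:
    shades = 2 ** bits_per_channel   # shades per primary color
    return shades ** 3               # red x green x blue combinations

for bits in (8, 10, 12):
    print(f"{bits}-bit: {total_colors(bits):,} colors")

# 8-bit:  16,777,216 colors    (~16.8 million)
# 10-bit: 1,073,741,824 colors (~1.07 billion)
# 12-bit: 68,719,476,736 colors (~68.7 billion)
```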

How does this make the image look better? Well, the increase in possible colors means that each scene can have far more natural-looking colors, with smoother gradients between them, resulting in an image that seems more vibrant and three-dimensional. This is also referred to as Wide Color Gamut, or WCG, and is currently only readily available in the world of TVs. Just to confuse matters, the term WCG is also used for PC displays, but these aren’t the same as HDR screens, offering nowhere near as much color information. Wide-gamut displays have been in use in creative industries for some time now, such as video editing, CAD and other professions where image quality is of paramount importance, but HDR will offer an even bigger improvement to color reproduction.

HDR also introduces much higher brightness to displays: the recommended specification for a true HDR screen is a maximum brightness of around 1,000 cd/m2, or candela per square meter, the unit used to measure the luminance of a screen. Today’s displays average around the 350 cd/m2 level, so HDR roughly triples this. At the opposite end, HDR displays should also be able to reproduce very dark blacks, down below 0.05 cd/m2, which is much darker than the average offered by today’s screens. By increasing the range between black and white, HDR also delivers much better contrast performance, so the detail in each scene should become far more noticeable, especially in darker scenes.
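As a rough illustration of why that range matters, contrast ratio is simply peak brightness divided by black level. The HDR figures below come from the paragraph above; the 0.3 cd/m2 black level used for a typical LCD is an assumed ballpark value, so treat that comparison as illustrative only.

```python
# Rough contrast-ratio comparison. HDR figures are from the article text;
# the typical-LCD black level of 0.3 cd/m2 is an assumed ballpark value.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

print(f"HDR target:  {contrast_ratio(1000, 0.05):,.0f}:1")  # 20,000:1
print(f"Typical LCD: {contrast_ratio(350, 0.3):,.0f}:1")    # ~1,167:1
```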

HDR Example – while they tend to look oversaturated in print examples, HDR displays deliver richer, more life-like colors. This color gamut chart shows how Ultra HD compares to regular TV when covering the color spectrum.

Unfortunately both Ultra HD Premium and Dolby Vision are currently only found in TVs. However, with Nvidia’s and AMD’s latest ranges of products both supporting HDR, we can expect to see HDR PC displays boom in the next year. The good news is that it’s easy for software developers to implement HDR in games, so it’ll likely become a standard feature over the same period of time. In the meantime, the most common source of HDR content is streaming media, such as Netflix and Amazon, both of which have HDR films in their libraries. Note that both Ultra HD Premium and Dolby Vision also deliver 4K resolution (3840 x 2160), and that means a huge leap in the amount of data being transmitted between your GPU and display.

HDMI 2.0

Delivering the extra bandwidth required for HDR plus 4K resolution needs a speedier video connection, which is where HDMI 2.0 enters the picture. HDMI 2.0 delivers a heady 18 Gbps, nearly double the 10.2 Gbps offered by HDMI 1.4, the previous version.

While HDMI 1.4 could handle 4K video, it would only do so at 30 frames per second. The new v2.0 doubles this to 60 frames per second, and also allows for the delivery of full 10- or 12-bit color. PC owners have access to another connection that is even faster than HDMI 2.0, in the form of DisplayPort 1.4, which has a theoretical maximum throughput of 32.4 Gbps.

This will allow for 8K displays running at 60 frames per second, or 4K displays at 120 frames per second, but let’s not get ahead of ourselves, as it’s unlikely we’ll see displays capable of this performance for several years yet.
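For a rough sense of where those limits come from, you can estimate the raw pixel data rate of a signal from its resolution, refresh rate and bits per pixel (10-bit HDR color works out to 30 bits per pixel). The Python sketch below ignores blanking intervals and link encoding overhead, and DisplayPort 1.4’s 8K modes also lean on Display Stream Compression, so treat these numbers as rough lower bounds rather than spec figures.

```python
# Back-of-the-envelope estimate of uncompressed video bandwidth.
# Real links also carry blanking and encoding overhead, and DisplayPort 1.4
# relies on Display Stream Compression for 8K, so these are rough lower bounds.
def video_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

print(f"4K @ 60Hz, 10-bit HDR: {video_gbps(3840, 2160, 60, 30):.1f} Gbps")   # ~14.9
print(f"4K @ 120Hz, 10-bit:    {video_gbps(3840, 2160, 120, 30):.1f} Gbps")  # ~29.9
print(f"8K @ 60Hz, 8-bit:      {video_gbps(7680, 4320, 60, 24):.1f} Gbps")   # ~47.8
```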

With the introduction of Ultra HD Blu-ray players, HDMI 2.0 has received an upgrade to HDMI 2.0a, which first and foremost adds support for HDR metadata. Those players also demand the HDCP 2.2 copy protection scheme, so check that your display supports it. Don’t fret if your display is only running HDMI 2.0, though, as most hardware should be firmware upgradable to HDMI 2.0a.

4K RESOLUTION

4K displays have been around on the PC for a while now, but it’s only recently that they’ve become affordable. It’s now possible to pick up a beautiful 28” 4K display, also equipped with G-Sync, for just over $600, in the form of Acer’s XB280HK. Unlike TV viewers, PC gamers sit much closer to their displays, which is why 4K is seen to offer a bigger advantage to gamers, as the added pixels make a noticeable difference to image quality. In TV-land, you need a set that is at least 70 inches across for 4K to make much of a difference to the viewing experience at the distance most viewers sit from their screen.
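To put some rough numbers on that, you can estimate how many pixels a screen packs into each degree of your vision from its size, resolution and viewing distance; around 60 pixels per degree is a commonly quoted limit for 20/20 vision. The screen sizes and distances in this sketch are assumed examples, not figures from the article.

```python
import math

# Rough "can you actually see the extra pixels?" check for 16:9 screens.
# ~60 pixels per degree is a commonly quoted acuity limit for 20/20 vision.
def pixels_per_degree(h_pixels: int, diagonal_in: float, distance_in: float) -> float:
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # width of a 16:9 panel
    ppi = h_pixels / width_in                        # pixels per inch
    return distance_in * math.tan(math.radians(1)) * ppi

print(f'28" 1080p monitor at 24in: {pixels_per_degree(1920, 28, 24):.0f} ppd')   # ~33: 4K helps
print(f'28" 4K monitor at 24in:    {pixels_per_degree(3840, 28, 24):.0f} ppd')   # ~66
print(f'50" 1080p TV at 9ft:       {pixels_per_degree(1920, 50, 108):.0f} ppd')  # ~83: 4K adds little
```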

There’s still the issue of the GPU power required to run at 4K, but today’s GeForce GTX 1080 and AMD’s dual-GPU Radeon Pro Duo can deliver playable framerates in the latest graphically intensive games at 4K… provided you turn down the detail levels a notch or two. There’s another way to view 4K content on your new 4K display, via streaming media such as Netflix. There’s one huge catch, though: to watch Netflix in 4K on your PC, you’ll need two things. First, it has to be played in Microsoft’s Edge browser, as it’s the only browser that offers the copy protection Netflix has embedded into its 4K streaming on PC. Second, you’re going to need one of Intel’s new 7th Generation CPUs, aka Kaby Lake, which hasn’t even hit the desktop yet, and is currently only available in a handful of laptops.

OLED

Organic Light-Emitting Diode (OLED) displays have been in use on smartphones for a while, but they’re still a rarity in PC-land. We’ve slowly seen more of them show up in laptops over 2016, and we can expect the OLED revolution to spread to desktop displays more quickly in 2017. Their primary benefit is that each pixel emits its own light, so brightness can be controlled on a per-pixel basis. In contrast, today’s LCD screens rely on changing the brightness of a backlight, which can only be done across general regions of the screen. As a result, OLED offers much better contrast ratios, as it’s possible to turn off the lighting for a pixel entirely, whereas backlit LCD screens can’t, so their blacks aren’t anywhere near as dark as an OLED’s. This also makes OLED the perfect partner for HDR.

THUNDERBOLT 3

Every motherboard manufacturer and their dog has been pushing the inclusion of these ports on this year’s latest motherboards, and one benefit of these high-speed interconnects is their ability to daisy-chain several monitors at once. According to the official specifications, Thunderbolt 3 can handle one 4K display (4096 x 2160, 30-bit) at 120Hz, one 5K display (5120 x 2880, 30-bit) at 60Hz, or two 4K displays (4096 x 2160, 30-bit) at 60Hz each. This sounds fantastic for owners of twin 4K panels, but there’s one slight issue.
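As a quick sanity check, the raw pixel data of each of those quoted modes fits within Thunderbolt 3’s 40 Gbps link. The sketch below counts pixel data only and ignores blanking and protocol overhead, so it’s an approximation rather than the official calculation.

```python
# Approximate raw pixel data rate of the quoted Thunderbolt 3 display modes,
# compared against its 40 Gbps link. Blanking/protocol overhead is ignored.
def gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

modes = {
    "1x 4K (4096 x 2160) 30-bit @ 120Hz": gbps(4096, 2160, 120, 30),
    "1x 5K (5120 x 2880) 30-bit @ 60Hz":  gbps(5120, 2880, 60, 30),
    "2x 4K (4096 x 2160) 30-bit @ 60Hz":  2 * gbps(4096, 2160, 60, 30),
}
for name, rate in modes.items():
    print(f"{name}: ~{rate:.1f} of 40 Gbps")   # roughly 32, 27 and 32 Gbps
```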

As these ports are on the motherboard, they’re driven by the integrated GPU on your CPU rather than your graphics card. As such, we don’t think Thunderbolt 3 is very important for gamers when it comes to displays; it’s aimed more at non-gamers who want to run 2D applications across multiple screens.

Every Thunderbolt 3 cable works with USB 3 Type-C connections, but not every USB 3 Type-C connection includes Thunderbolt 3. You’ll need a High Speed HDMI cable to handle the added bandwidth of HDMI 2.0.
