HDMI stands for High-Definition Multimedia Interface. Since it has 'HD' in the name, it is easy to guess when it was first introduced and implemented: putting 'HD' front and center is so early 2000s. HDMI is a digital interface that was, and still is, competing with DisplayPort. The crucial thing in this standoff is that DisplayPort was always meant for computers and the like, while HDMI was developed by a consortium of consumer tech manufacturers, including TV makers. Historically, this meant a much wider spread of the technology, so any new type of device that supports an external display for any purpose can be expected to have HDMI.
Just to keep this explanation fair, DisplayPort still has one undoubtedly significant advantage: native support over Thunderbolt and USB-C. Thunderbolt and USB-C are complex standards, so I won't dive deep into them here. It should be noted that if you have a fairly recent laptop with USB-C, connecting it with a USB-C to DisplayPort cable is simpler than going through a USB-C to HDMI converter. This makes no difference to 99% of users, but an unnecessary converter can bother some.
All that aside, back to the topic. HDMI has gone through several versions. The usable (or, should I say, meaningful) ones right now are 1.4b, 2.0, and 2.1. The hot takes below could be technically imprecise, but they are a good rule of thumb for evaluating cables and devices:
HDMI 1.4b is the version with a roughly 10 Gb/s transmission rate, used for Full HD (and up to 1440p with some limitations) plus basic additional functions like HDCP, ARC, Ethernet, and CEC.
HDCP is a content protection protocol that has been used since the beginning of HDMI and is closely tied to it. In simple terms, it is there to ensure that the endpoint of the system is indeed a monitor and not streaming or recording equipment. Suppose one tries to send an HDCP-protected Netflix show from a laptop to a box with no HDCP support (like some enterprise screen-sharing solutions). In that case, the image on the screen will simply be black.
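As a toy model only (the real HDCP handshake involves key exchange and encryption, which I am skipping entirely), the source-side behaviour described above boils down to something like this:

```python
# Toy model of the behaviour above: if the sink cannot prove HDCP support,
# the source keeps the link alive but blanks the protected picture.
# Illustration only; real HDCP performs cryptographic authentication.
def frame_to_send(frame, content_is_protected, sink_supports_hdcp):
    if content_is_protected and not sink_supports_hdcp:
        return "black frame"   # what you see on a non-compliant endpoint
    return frame

print(frame_to_send("movie frame", True, False))   # -> black frame
print(frame_to_send("movie frame", True, True))    # -> movie frame
print(frame_to_send("home video", False, False))   # -> home video
```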
ARC stands for 'audio return channel': video and audio signals are sent to the TV, the video is shown, and the audio is transmitted back through the same HDMI cable to be played on the player's, console's, or PC's side. The easy way to play sound on some standalone audio system instead of the TV speakers is, of course, the trusty audio jack output on the TV. Using ARC is the right way to do it with an amp or an audio receiver sitting near the media player. HDMI audio can be multi-channel, while the one-eighth-inch stereo jack is limited to just two channels.
Ethernet is an excellent addition to video and was added to the standard relatively early. It can be implemented on a single twisted pair inside an HDMI cable. Ethernet over HDMI is neither fast nor widespread, so the idea of ditching ARC and CEC (Ethernet reuses their wires) never took off.
CEC is a basic, unified list of commands for consumer electronics, like 'on', 'off', 'switch to input 1', and so on. CEC control requires no input from the user, and due to its simplicity it is quietly implemented in most consumer media players. When streaming starts, any Chromecast-like device does not hesitate to send a few bytes to the TV to wake it from standby and switch to the appropriate input.
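For the curious, here is a minimal sketch of the kind of commands such a box effectively sends, assuming a Linux machine with libCEC's cec-client installed and an HDMI port that is actually wired for CEC (many PC graphics cards are not); the exact commands and device addresses are assumptions and vary by setup.

```python
import subprocess

def cec_send(command: str) -> None:
    """Pipe a single command into cec-client (part of libCEC).

    -s   : single-command mode, exit after sending
    -d 1 : keep log output to a minimum
    """
    subprocess.run(
        ["cec-client", "-s", "-d", "1"],
        input=command.encode(),
        check=True,
    )

# Wake the TV (CEC logical address 0) from standby...
cec_send("on 0")
# ...then declare ourselves the active source so the TV switches inputs.
cec_send("as")
```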
HDMI 2.0 is the one with 18 Gb/s throughput. This version can deliver 4K at 30 frames per second with no color compression, or up to 4K 60 fps with a slightly reduced color space and no HDR. So if you plan on using a budget-friendly TV or monitor, not a crazy-bright one and not connected to a top-of-the-line gaming rig capable of outputting 4K at 120+ fps, then HDMI 2.0 transmitters, cables, and switches are fine. But if you do, then the next version is for you. Additional functions are static HDR (in 2.0a; dynamic in 2.0b) and better, more capable CEC and ARC.
'No color compression' means using the full 4:4:4 YCbCr sampling. The number sequences 4:2:0 and 4:2:2 represent different levels of chroma subsampling: luminance is kept at full resolution while the colors are stored less accurately, on the assumption that the average consumer does not see the difference. Certain edge cases do become apparent, especially when displaying color gradients. 4:2:2 is mostly fine for watching any content or playing games; working with text and numbers needs 4:4:4, as fonts are fine-tuned to be displayed with no compression and require every pixel to be exactly right in terms of color and intensity to look good.
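To make the 'less accurate colors' part concrete, here is a back-of-envelope sketch of how many bits per pixel each sampling scheme carries at 8-bit depth; the figures are idealized averages and ignore any link-level overhead.

```python
# Average bits per pixel for 8-bit YCbCr with different chroma subsampling.
# Luma (Y) is always full resolution; only the chroma planes (Cb, Cr) shrink.
BITS_PER_SAMPLE = 8

SCHEMES = {
    "4:4:4": 1.0,   # chroma at full resolution
    "4:2:2": 0.5,   # chroma halved horizontally
    "4:2:0": 0.25,  # chroma halved horizontally and vertically
}

for name, chroma_fraction in SCHEMES.items():
    # one luma sample + two chroma samples scaled by the subsampling factor
    bpp = BITS_PER_SAMPLE * (1 + 2 * chroma_fraction)
    print(f"{name}: {bpp:.0f} bits per pixel")
# 4:4:4 -> 24, 4:2:2 -> 16, 4:2:0 -> 12
```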
HDR stands for 'high dynamic range' and is a widely marketed feature. HDR means more light in general and more luminance levels for cinematic night shots, sunsets, and fire.
HDMI 2.1 takes the data rate up to 40-48 Gb/s. That is enough for 4K and 5K at high frame rates with HDR and no limitations, or for the still rarely used 8K (the spec goes up to 10K, but that is expensive to test), with many audio channels transmitted forward and backward. Since the mainstream resolution is still 4K, as in the previous version, the added functions are centered around variable refresh rate (VRR), better color accuracy, and improved sound. It has all the bells and whistles imaginable; the latest revision of the specification came out in August 2023. So, for now, HDMI 2.1 should be considered future-proofing: a video system that will still be used to its fullest long after purchase.
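To turn those throughput numbers into something you can sanity-check yourself, here is a rough sketch that estimates the raw video data rate of a mode and compares it to the nominal link rates quoted above; it ignores blanking intervals, encoding overhead, and compression (DSC), so treat it only as a back-of-envelope guide, not a compatibility checker.

```python
# Rough data-rate estimate: pixels per second * bits per pixel.
# Ignores blanking, 8b/10b or FRL encoding overhead, and DSC compression.
NOMINAL_LINK_GBPS = {"HDMI 1.4b": 10.2, "HDMI 2.0": 18.0, "HDMI 2.1": 48.0}

def video_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

MODES = [
    ("1080p60, 8-bit 4:4:4 ", 1920, 1080, 60, 24),
    ("4K60,    8-bit 4:4:4 ", 3840, 2160, 60, 24),
    ("4K120,  10-bit 4:4:4 ", 3840, 2160, 120, 30),
]

for name, w, h, fps, bpp in MODES:
    rate = video_gbps(w, h, fps, bpp)
    fits = [v for v, cap in NOMINAL_LINK_GBPS.items() if rate < cap]
    print(f"{name}: ~{rate:.1f} Gb/s raw, plausible on: {', '.join(fits)}")
```

Running it reproduces the rules of thumb above: 1080p fits everywhere, 4K60 already needs HDMI 2.0, and a 4K 120 fps HDR signal only makes sense on HDMI 2.1.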