I don't use HDMI, and I never will


Jun 04, 2025


I've used a wide range of PC monitors over the years, from early monochrome CRTs to high-speed 480Hz OLEDs, utilizing a diverse array of connectors during that time. However, ever since the digital switchover from VGA and DVI, I've used a single display cable standard exclusively for my desktop PCs and laptops. That's DisplayPort, whether over a full-size GPU port or DisplayPort Alt Mode via USB-C.

As to why: you wouldn't buy a sports car and put slow tires on it, would you? That's what plugging one of the best gaming monitors into any other cable feels like to me, because DisplayPort gives you the best bandwidth, the best connector type, and the best expanded feature set for PC use. I still use HDMI for my TV, because DisplayPort is conspicuously absent in the A/V space. But that's just a matter of using the best tool for each job, and for PC, DisplayPort is it.


I've been a DisplayPort user since my first custom gaming PC, but I almost wasn't. See, that yellow Nvidia GeForce GTX 970 up there was mine, painstakingly customized to fit a black and yellow build theme. To pair with it, only the finest 144Hz TN panel of the time would do, the Asus VG248QE.

However, I initially selected a larger Samsung monitor that only supported HDMI and had poor image quality. The HDMI connection wasn't the cause of the bad image; it was simply a terrible panel. HDMI would have been an issue when the Asus monitor arrived, though, because hitting the full 144Hz required DisplayPort or dual-link DVI. DVI was on the way out, so I opted for DisplayPort, because when your monitor costs as much as your GPU, you use the best connector available.

And it was worth every cent. Smooth frames, crisp visuals, and since my brain had yet to be warped irrevocably by 4K displays or ultrawides, I was the king of my couch. At the time, FreeSync and G-Sync were both relatively new, and both required DisplayPort to function; it wasn't until later that any monitors supported adaptive sync over HDMI. Again, as an avid gamer, you opt for the best you can get, and DisplayPort was that, whether I had an Nvidia or AMD GPU.


Ever since DisplayPort first came out, it's been outpacing HDMI for PC use. Whether it was better resolution support, higher bandwidth for 10-bit color and high refresh rates, variable refresh rates, or other features, DisplayPort has long been my favorite since we moved to all-digital display chains. While HDMI 2.2 was just announced at CES, it'll be a while before it shows up in any devices, so for now we only have HDMI 2.1 and DP 2.1 to compare.

And the winner in almost every category is DisplayPort. Don't get me wrong, HDMI is still a good option, and it has specific use cases that DisplayPort can't compete with, but for PC use, DisplayPort is the better choice for almost everyone.

| Feature | HDMI 2.1 | DisplayPort 2.1 |
| --- | --- | --- |
| Max resolution | 4K@120Hz, 8K@60Hz | 16K@60Hz, 8K@120Hz, 4K@240Hz |
| Max bandwidth | 48 Gbps | 77.37 Gbps (at UHBR20) |
| HDR support | Yes, dynamic | Yes, static |
| Audio support | eARC | Yes, up to 8 channels of 24-bit/192kHz |
| Number of displays (per port) | 1 | Up to 4 |
| Adaptive sync | FreeSync and G-Sync (both depend on the monitor) | FreeSync, VRR, G-Sync |
| Popular uses | Gaming consoles, A/V receivers | Gaming PCs, Thunderbolt 4/USB4 mobile devices |

DisplayPort is also more versatile for laptop users, even if many laptops keep an HDMI port for projectors in conference rooms. DisplayPort Alternate Mode passes DP signals over USB4 or Thunderbolt 4 connectors, which are now common on laptops and many desktop motherboards. That puts it on par with the full-size DisplayPort connector, with a maximum bandwidth of 80 Gbps, enabling up to 16K video output. Not practical, sure, but HDMI doesn't work over USB-C and still needs the larger connector. And HDMI can't charge your laptop while connected, whereas USB-C can.
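To put those bandwidth figures in perspective, here's a rough back-of-the-envelope comparison. This is a simple sketch that ignores blanking intervals and assumes 10-bit RGB, so real link requirements run a little higher than shown; the payload figures for each link are approximations after line coding.

```python
# Rough uncompressed video data-rate estimate (ignores blanking and protocol
# overhead, so real-world requirements are somewhat higher than shown).
def data_rate_gbps(width, height, refresh_hz, bits_per_channel=10):
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9  # RGB

modes = {
    "1440p @ 240Hz": (2560, 1440, 240),
    "4K @ 144Hz":    (3840, 2160, 144),
    "4K @ 240Hz":    (3840, 2160, 240),
    "8K @ 60Hz":     (7680, 4320, 60),
}

# Approximate usable payload after line coding:
# HDMI 2.1 FRL: 48 Gbps raw, ~42.7 Gbps payload (16b/18b)
# DP 2.1 UHBR20: 80 Gbps raw, ~77.4 Gbps payload (128b/132b)
links = {"HDMI 2.1": 42.7, "DisplayPort 2.1 (UHBR20)": 77.4}

for name, (w, h, hz) in modes.items():
    need = data_rate_gbps(w, h, hz)
    fits = [link for link, cap in links.items() if need <= cap]
    print(f"{name}: ~{need:.1f} Gbps -> {', '.join(fits) or 'needs DSC on both'}")
```

The exact cutoffs shift once you factor in blanking, chroma subsampling, or DSC, but the headroom gap between the two links is the point.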

Honestly, as a PC user, all you need to know about HDMI is that it's better off used on your TV, where it makes more sense. Things like eARC were designed specifically for home theater use, and they're barely supported on PC, if at all. There's a good reason graphics card manufacturers usually put three DisplayPort ports and one HDMI port on their GPUs, except for the few that use two HDMI ports for virtual reality users, or the occasional USB-C port. It just makes more sense on PC to use DisplayPort, with its latching cables, better bandwidth, and PC-centric feature set.


DisplayPort has been a fixture on my PCs since I started building them again a decade or so ago, and it'll be what I use for the next decade, and the one after that (unless something like GPMI replaces it). The DisplayPort consortium has a page dedicated to certified cables and devices, where I go every time I need new cables. I've found that uncertified ones are hit-or-miss with compatibility, but I've also encountered similar issues with HDMI. With gaming handhelds, mobile phones, and laptops all embracing DisplayPort Alt Mode, I now look for USB-C connectivity on new monitors.

We want to hear from you! Share your opinions in the thread below and remember to keep it respectful.


First, DVI is digital. Second, the word is "disdain," not "distain."

Maybe journalism isn't the author's best career choice.

Also the "woudn't" ... apparently DisplayPort doesn't accommodate Spell-Check.

I went from interested in their opinion to immediately disregarding it and instead ran to the comments in hopes that it wasn't just me.

Tekka

Blogging has never been journalism, no matter how much they want us to believe it is.

Yeah. Journalism at its best 🥱😅

So first off, you can't tell the difference between formats and are just bloviating. Do you have that 8k gaming screen where you might notice the difference? You gaming with more than 9 channels of sound?

Secondly, the HDMI standard is 5 years older than DisplayPort, mostly because it didn't need upgrading. But right now they have HDMI 2.2 available with 96 Gbps vs only 77 for DP. Are you gonna claim the slower DP is now better?

USB-C seems like an entirely different beast with its own set of benefits and problems.

This!!!!

It would be exceedingly difficult to find a more accurate way to say what you said. Bonus points for using the term 'bloviating'. 😂

USB 3.1 / 3.2 / 4 (USB-C is the connector; USB 3.1 / 3.2 / 4 are the protocols) does not guarantee that video output will work. While you might hear it's a native feature, I personally wouldn't even consider it a feature, because DisplayPort Alt Mode is just connecting some of the USB-C pins directly to a video chip so you can use it as a video output. It's the same with a lot of the "features" USB-C can do, like external video cards: they require the USB-C port to be attached to PCIe lanes, and not all devices support that, so not all devices can use external video cards like that.

Let's also take into consideration that DisplayPort has 20 wires in it and USB-C has 24, and while some pins in both are just for power, if you read up on DisplayPort Alt Mode it explains that it repurposes some of the USB-C pins and connects them to the video chip. Now, I will admit that I do not completely know how this all works, but to me that means you aren't getting the full amount of data that DisplayPort is capable of, since the USB-C connector still needs connections for the rest of the devices it can be connected to through the same port, so it can't have all 20 pins that DisplayPort usually has. But this is just an observation; I do not completely understand the whole way it all works, I am only stating what seems plausible to me with my understanding of it.

So to me, the writer talking about using USB-C to DisplayPort seems weird vs just using HDMI. And let's also talk about the fact that humans generally can't see past 60 fps. While there are some individuals that researchers have noticed may be able to see higher than 60 fps, generally most can't, so if you want better gaming, limit your game to 60 FPS and you will get better performance needing less bandwidth, making HDMI perfectly fine. Granted, HDMI is perfectly fine for almost everyone anyway, since most people can't afford the prices for equipment that displays 4K, and most people still game at 1080p, followed by 1440p.

Riot2212

I'm by no means an expert in AV, but I've gamed with a high-refresh monitor for at least 15 years.

"Better gaming" is not about 'seeing' above 60fps per se, but motion fluidity is a thing, and at insanely high refresh rates the underlying benefit is latency reduction.

Higher refresh rate is about 'feeling', i.e. latency and response time from input devices.

There's absolutely no way I'd ever purposely go back or limit myself to 60fps, which to this day FGC titles (e.g. fighting games like Tekken, Street Fighter, SNK, etc.) absolutely do for competitive, animation, and balance purposes.

I'd say around 144-180fps I can't 'feel'/see much of a difference anymore. Again, input latency/responsiveness is more important at high refresh rates than the higher frame rates themselves.

A 30/60fps/Hz container is excellent for video recording, and when frametime from the game engine is erratic and spikes well above the 33ms/16ms budget, locking fps to a lower, stable framerate can be the way to go. (UE5 games, I'm looking at you.)
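To put rough numbers on that frame-time point, here's a tiny sketch; it's simple arithmetic, nothing interface-specific.

```python
# Frame time (ms) at common refresh rates; the step between successive rates
# shrinks, which is roughly why gains feel smaller past ~144-180Hz.
for hz in (30, 60, 120, 144, 240, 480):
    print(f"{hz:>3} Hz -> {1000 / hz:5.1f} ms per frame")
# 30 Hz -> 33.3 ms, 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms, 240 Hz -> 4.2 ms
```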

Literally no one cares what you use, do you want a cookie?

DP 2.1 is only available on 50 series GPUs...DP 1.4 is the current standard for 99% and should have been the comparison.

This. They've chosen the new DP to compare against the old HDMI. That's like comparing a 5 year old family car to the newest updated family car.

Yeah I went through this trying to get 4k@60 working. DisplayPort couldn't do it at the time, except maybe if you tunneled an HDMI 2 signal over it. Just because DisplayPort is so limited it can only handle video doesn't make it a better tool. That's like arguing that FireWire is superior to USB-C for cameras.

This place has been going downhill lately with these elitist and smooth-brained articles. Just one more and bye-bye from my Google News feed.

I immediately went to the comment section to check if it's only me with the same impression.

It's sad what happened to XDA

Absolutely right, XDA used to be my go-to for useful articles when I was still learning PC building and software tips and tricks.

But GOD DAYUM are they annoying with anything and everything they post; literally nobody reads the whole post, and everyone goes straight to the comments to see if others are annoyed by the same thing.

A big shame is that the comment section on XDA posts is one of the most relatable comment sections I've ever seen, but unfortunately this is it - goodbye XDA "community", this site is going away from my news feed.

Don't forget a dozen articles a week about prompt engineering.

"distain"

It's amazing how far XDA has fallen from the days of burning your own Android ROMs. Now we get articles like this, written by kids who may have been in diapers when the first Android phones came out.

Sad but very true statement. iPad kids are among us now and it's scary. PE ruins everything.

I think I know why.

Some digging revealed that XDA was bought by Valnet back in 2022, and Valnet has a reputation for turning their websites into clickbait factories. They have a very diverse portfolio to say the least, and from what I've seen most of them have gone downhill over the past few years.

The author doesn't seem to know that only the 5000 series has DP 2.1 currently.

Second, have you heard of the 75-inch LG G5? Only HDMI can be used for such glorious monitors.

Size absolutely matters.

Yeah, went with a 75-inch TCL QM851G last year. Don't get me wrong, OLED is gorgeous, but I play games for hours, and everyone I know who plays a lot like me ended up with burn-in on it :(

Dan & Eric, I've had 2 OLED TVs, an LG 55-inch C7 OLED & now a 65-inch C2 OLED, which I only upgraded to for the VRR, 120Hz, and ALLM for the PS5, and even the C7, which is god knows how old now, hasn't developed burn-in; in fact it is still going strong as I gave it to my mum. Also, both screens are/were used heavily outside of gaming for motorsports with static overlays.

Honestly don't know what you guys and your friends are doing to get burn-in that consistently, as I'm a member of an AV forum and even there the number of people getting burn-in is minimal🤔

Have one, got burn-in on it from playing the same game on PS5. Wish I'd gone LED now; something they don't tell you about OLED.

I've been using OLED exclusively since 2016's LG B6, and have added to my collection of OLEDs since with the 2019 LG C9, 2022 LG G2, 2023 LG C3, and 2023 Samsung S90C...

I have a MASSIVE amount of experience with OLEDs of all types, levels, and ages...

With that said, I can tell you that while burn-in WAS definitely a thing you USED to have to worry about, on today's latest OLED displays it's almost a non-issue for 99.5% of users in the majority of scenarios.

You have to go out of your way these days to really abuse your display to get burn in and it's why in the early days manufacturers wouldn't cover burn in via warranty but many today absolutely do.

They cover it now because they know it's very much unlikely to happen for the vast majority of users.

I probably use my displays in some of the most abusive ways imaginable, including hundreds to thousands of hours of the same game on the same display many times back to back to back non-stop for weeks if not months... And while this abusive use did in fact give me a very slight bit of burn in on my very first OLED the 2016 LG B6... It happened within the first 3 months of ownership and it happened because of my purposeful disabling of protections and abusive use of playing the same game basically 18 hours a day, 7 days a week for that entire 3 months.

I've done similar usage on the more modern displays since and put at minimum 5,000 hours on each of my displays, none of which has developed even a hint of burn-in.

Whichever display is my current personal main display gets used practically 24/7, and when it's not being used to game, it's playing content, usually on YouTube, which many times during the night will freeze and display the same image non-stop for 8 hours until I wake up and fix the frozen YouTube issue.

Again, none of this extremely abusive behavior has led to any burn-in on any of my displays that are 2019 or newer.

Basically, burn-in is a non-issue at this point; people need to stop with the fear mongering.

Really we are at this level now? Also isn't HDMI the same tech as DP, plus some patented DRM protocols? Which yes makes DP the friendlier cable, but come on!

The lack of quality in this piece made me think it was a Tanveer article but nope

Two of my three monitors came from the thrift shop and cost less than a single DP cable.

I love my DVI to HDMI to DP monstrosity.

I'm running HDMI from my PC to TV, 4k 144hz HDR, no issue.... So that 120hz limit isn't right.

HDMI 2.1 at 48 Gbps has enough bandwidth for 4K 144Hz at 10-bit without using DSC.

At 10-bit? Or is it using DSC?

@the It's possible without display stream compression... I know my Samsung s90c is doing it right now though it did require that I go in and manually edit a few settings in the TV service menu as well as add it to the EDID.

But once done it works great and not all 144hz 4k displays via HDMI require even this.
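For what it's worth, the math on that claim roughly checks out, assuming typical reduced-blanking timings; the exact raster totals below are assumptions, and the real figures depend on the display's EDID.

```python
# Rough check: 4K, 144Hz, 10-bit RGB over HDMI 2.1 FRL without DSC.
# Total raster (active + blanking) is an assumed CVT-R2-style figure.
h_total, v_total = 3920, 2222         # assumed totals for a 3840x2160 active image
refresh_hz, bits_per_pixel = 144, 30  # 10 bits per channel, RGB
required = h_total * v_total * refresh_hz * bits_per_pixel / 1e9
frl_payload = 48 * 16 / 18            # 16b/18b line coding on a 48 Gbps link
print(f"needs ~{required:.1f} Gbps, FRL payload ~{frl_payload:.1f} Gbps")
# needs ~37.6 Gbps, FRL payload ~42.7 Gbps -> fits without DSC
```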

I'm gaming at 4K 160Hz with G-Sync enabled over HDMI 2.1.

What the **** is this article

Do some research first

Which display are you doing this with and does that require the use of Display stream compression?

Title: "I don't use HDMI, and I never will"

Six sentences later: "I still use HDMI for my TV"

Clickbait article

One could argue it's a ragebait article.

Your justification is so ridiculous, I would have just kept this to myself. "Better to be silent and thought an idiot than open your mouth and remove all doubt" territory with this one...

Good for you, but the reality is that if you wanted 4K OLED + 120Hz with good HDR, HDMI was the only option for quite a long time. It's still a pretty viable option if you want a big screen without overpaying for the few dedicated PC monitors, and instead get a TV with higher brightness and dynamic HDR on top.

HDR: dynamic vs static. This is why TVs use HDMI (not the only reason). Ultimately, a monitor is there to display a picture; comparing any other feature is pointless. One has full dynamic HDR, one does not. That is huge. Everything else is secondary to the picture on a display.

Once you start getting resolutions of 1440p to 4k and are still sitting fairly close to your display, like 90% of PC gamers, yes. Absolutely. HDR is far better to have than even higher resolution.

OP isn't even right. DP 1.4a and newer support HDR10+ (dynamic).

HDR is more important than resolution? Refresh rate? Frame sync? So a 320×200 display at 87Hz interlaced, with every frame showing half of one image and half of the next, is better than 8K@120Hz as long as the blacks are blacker?

XDAMember

You're using a straw man fallacy there.

Member86 was making a point about the significance of dynamic HDR for picture quality on a display, not claiming it's the only thing that matters or that it outweighs every other display spec to such an absurd degree. Your example of a "320x200 display at 87Hz interlaced" twists their argument into something it's not.

Nah, I use my 50-inch display as a monitor, it only has HDMI, so that...

HDMI 2.1b is also available. HDMI 2.1b does support VRR and will go up to a 144Hz refresh rate. Please get the most up-to-date information next time.

Thank you.

I like audio and video thru one cable. Two signals, one plug

So... my 4K TV supports VRR over HDMI. Also, arguing that higher resolution and frame rates are objectively better is silly. We're likely never going to see 16K gaming achievable among AAA games. And only enthusiasts care about super high frame rates. For most people, 120Hz is probably suitable.

100%, pretty happy with my PC plugged into a good 4K 144Hz TCL QM851G. Hell, even with a 4080 Super it's hard to run 4K games maxed out at frame rates that will fill a 4K 120Hz panel.

Low effort rage bait. If you can't figure out how to use spell check you have nothing worth saying.

Games journalism isn't what it used to be.

Blogging isn't journalism. Never has been.

Holy crap this whole thing is cringe, and I can only assume the author went all in for Monster cable in his home theater system.

It's digital: it either works or it doesn't, and the bandwidth is either enough or it isn't.

As always, RTFM.

MANY MANY monitors only support their high refresh rates via the HDMI port. My MSI OLED only gets 165Hz when connected via HDMI. The refresh rate is confirmed in both Windows and Nvidia settings, as well as in Special K's readouts. 🤷‍♀

DisplayPort also passes RS232 signalling, great for video walls.

(I've also got an HDMI adapter that works fine with USB-C for input testing the above as well👍)
