I Almost Bought a 4K Monitor

Returns diminished

Alex Rowe
5 min read · Apr 21, 2019

Over the weekend, I was wandering around in my local Best Buy, and I noticed that they had the new LG 27UL600 gaming monitor in stock.

It’s an update of last year’s “K” model, with the second letter of the name changed to “L” and added support for the VESA DisplayHDR 400 standard in addition to HDR10. It goes for a little over $400, and it has everything I’d probably want out of a 4K monitor for my home desktop setup: full 4K support at 60Hz, two DisplayPort and two HDMI inputs, and a nice-looking implementation of basic HDR support.

In fact, even on the bog-standard out-of-the-box settings on the shelf at Best Buy, the improvement in brightness and contrast was obvious compared to the other monitors around it.

I’m still running a 1080p display in my main desktop setup at home. I have more pixels in my MacBook and my cell phone than on my home monitor. It’s a 27-inch Asus IPS model with a decent response time and…interlaced 3D. That’s right, I’m rocking a passive 3D monitor at home every single day.

Before you ask, no, I haven’t touched the 3D support since the first six months I owned the monitor. And that was several years ago now. I used it to watch some 3D Blu-rays and play a handful of PS3(!) games that had full 3D support, and that was it.

I’ve known for a while now that I’d probably need to upgrade it some day…but I’ve had a really hard time getting excited about 4K.

4K is the literal point of diminishing returns as far as resolution goes. That’s both my personal opinion and my opinion as a one-time official film/video person.

At one time, folks believed that 4K resolutions had enough pixels to adequately scan all the detail out of a film print, but now we’ve got 8K and 10K masters. Still, in home environments and at typical viewing distances, it’s really hard to see a dramatic difference between 4K and 1080p. 4K is even enough pixels to project onto a large theater screen.
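To put rough numbers on that claim (my own back-of-the-envelope illustration, not anything from a spec sheet): a common rule of thumb is that 20/20 vision resolves around 60 pixels per degree of visual angle, and the 55-inch screen size and 9-foot couch distance below are assumptions chosen just to make the math concrete.

```python
import math

def pixels_per_degree(horizontal_pixels, diagonal_inches, distance_inches,
                      aspect=(16, 9)):
    """Approximate pixels per degree of visual angle at the center of the screen."""
    w, h = aspect
    width_inches = diagonal_inches * w / math.hypot(w, h)
    ppi = horizontal_pixels / width_inches
    # At viewing distance d, one degree of visual angle spans roughly
    # d * tan(1 degree) inches of screen.
    return ppi * distance_inches * math.tan(math.radians(1))

# Assumed scenario: a 55-inch 16:9 TV viewed from about 9 feet away.
for name, pixels in [("1080p", 1920), ("4K", 3840)]:
    ppd = pixels_per_degree(pixels, diagonal_inches=55, distance_inches=9 * 12)
    print(f"{name}: ~{ppd:.0f} pixels per degree (20/20 acuity is roughly 60)")
```

By that rough math, plain 1080p already sits above the ~60 pixels-per-degree threshold at couch distance, which is a big part of why the jump to 4K feels so much less dramatic than the jump from SD to HD did.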

In a computer desktop environment, it’s hard to argue for its benefits.

I’m not saying that there’s no difference. Higher pixel counts and densities have a positive impact on image quality, and there’s still a little improvement to be had. But 4K has four times the pixels of 1080p, so it also requires a massive increase in processing power to render game graphics in real time at high framerates, and upgrading to a good 4K display setup all but guarantees you’ll have to make more technology purchases down the road.

HDR is a much more noticeable upgrade, and the lack of HDR support in desktop monitors is the bigger reason I haven’t jumped up yet. But this new LG monitor is finally right where I’d want it to be, and I have both a console and a PC that could push HDR content to this display.

Unfortunately, just like with old 3D implementations, HDR is heavily reliant on the quality of the encoding. Whether it’s a film or a video game, HDR implementations vary wildly in quality. Much like how 3D films looked far better when shot in 3D rather than cheaply converted in post-production, HDR is something that game designers need to plan around, and film colorists need to have a good grasp of when mastering movies for the home HDR formats.

Ubisoft has received a lot of praise for its HDR implementations, but at launch, Rockstar’s Red Dead Redemption 2 was panned for having basically no “real” HDR support at all.

Once you’re in the world of HDR, it’s not a simple matter of turning it on and having things look better.

You’ve now got to watch out for good implementations and bad ones. There’s also a variety of competing formats, like Dolby Vision and HDR10, along with a number of different brightness levels that displays can support. The 400-nit level that the new LG monitor offers is the lowest brightness generally considered acceptable for a true HDR display.

This inconsistency is the same thing that put me off stereoscopic 3D for movies and video games. 3D never penetrated the market deeply enough for even the cheapest content to reach an acceptable baseline of quality.

If I’m going to spend money on a display upgrade, I want everything to look better. I don’t want to have to worry about poorly performing 4K games or shoddy HDR implementations. But that’s the weird quality middle ground we’re still living in, and although it’ll improve over time, it’s really hard for me to get excited about this stuff right now.

Most of my screen time is spent in a desktop or laptop environment. TVs have progressed much faster than computer monitors in the 4K and HDR realms this time around, which is a bummer. I don’t want to convert to a living room setup just to get access to the latest HDR standards.

My current monitor works well and offers enough pixels to provide a good gaming and computing experience. When I bought it, I was stepping up to an IPS panel and getting access to the wacky and now mostly defunct world of 3D. I want at least that same level of “newness” in my next monitor purchase.

Some of you might be shouting, “Why not buy a high-refresh-rate 1080p display?” I’ve seriously considered that. But I’m reasonably happy with 60 FPS gaming, and I’m not trying to enter the competitive realm where faster refresh rates and response times would help. It’s at least an option in the back of my mind.

HDR is, in theory, the big feature that I want from a new monitor, and 4K is just a nice perk that will make text as smooth as it is on my MacBook. But until the standards war and the implementation nightmare are sorted out, it’s hard for me to justify spending the money. Half the games and movies I buy don’t even offer HDR support, and the ones that do weren’t all treated with the same level of care in their conversions.

In the rush to push more and more pixels, I feel like we’ve lost a little something. We haven’t even mastered 1080p 60Hz gaming yet as a base standard of performance, and yet gaming companies and manufacturers are already pushing hard towards 4K being a standard and 8K being the future.

Are we losing access to better effects at higher framerates in the push towards higher pixel counts above all else? Are movie studios going to spend the time to properly convert every catalog title to HDR or just slap on basic filters? When 8K rolls around in full force in a couple of years, will anyone even have the hardware to fully support it without upscaling? As streaming services become the new norm for gaming and movie content, is having more and more pixels really a big deal?

Sometimes, enough is enough. Sometimes the underlying art has to be more important than the number of pixels in the display. I know that display manufacturers have to keep selling displays, but I feel like this push is in real danger of holding us back instead of pushing entertainment forward.
