Our Brave New World of 4K Displays

Codinghorror, did you just buy 3 monitors for $2,100?

What video card(s) would you suggest for photography use?
Thanks.

“and trust that everyone is getting their collective act together by now on software support for high DPI.”

HA HA HA, oh, my, you are so funny!

Seriously, if you care deeply about this issue, switch to OS X.

3840×2160? How can you do anything at that resolution? I tried it on a PA328Q, a 32" display, but the text is too small. I wonder what the best resolution is in that case? Also, which machine are you using?

Well, you could go for one of these 24" or 30"-ish monitors, or you could go all the way and use a 4K 40" PC monitor with no scaling: http://www.amazon.co.uk/gp/product/B00OO9YWR0 (and it’s not even that expensive!)

Come on, you know you want to.

And another $700 for a modular monitor mount from Herman Miller. :smile:

What is the pixel response time on these like, though? That ends up mattering for the overall perceived refresh rate (which can end up lower than 60 fps) and for ghosting on fast-changing screens, like you may see in a first-person shooter. The specs list a 5 ms gray-to-gray response time, but don’t get into the specifics. That and input lag are a little subjective, I admit, but we have someone to give us a first-person account here!

So that’s ($698.99 × 3) + $679.99 = $2,776.96 + tax + shipping, in USD. So right now, for me to buy those it’d cost $3,626.71 (the Canadian dollar sucks right now).

Got any recommendations for folks who don’t want to spend more on their monitors and graphics card than they did on their entire computer (by about 4x)? I’ve got an AMD A4-5300 desktop at home with 16 GB of RAM and a Radeon HD 7770. Add in the hard drives (an SSD and a regular HD) and a few other things, and the total for the entire thing was about $700 CAD. I’ve got two monitors right now, one 1920×1080 and one at a slightly lower resolution (can’t remember the specifics right now). I don’t think they cost me more than $300.

I’d love to have three UHD monitors and the GPU to back them up – but there’s no way I could justify spending almost 4x what I spent on my desktop in order to get those monitors and graphics card.

I think we can stop now. 8K could make sense for a 60" screen in front of your couch, but for a 27" screen two feet from your nose? C’mon…
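
For a rough sanity check, here is the standard one-arcminute rule for 20/20 vision worked out. This is strictly a back-of-envelope sketch; the two-foot viewing distance and the acuity figure are my assumptions, not anything from the post:

```cpp
#include <cmath>
#include <cstdio>

// 20/20 vision resolves features about 1 arcminute apart, so at viewing
// distance d the finest useful pixel pitch is d * tan(1'), i.e. roughly
// 3438 / d_inches pixels per inch.
double acuity_ppi_limit(double distance_inches) {
    const double arcmin_rad = (1.0 / 60.0) * (3.14159265358979323846 / 180.0);
    return 1.0 / (distance_inches * std::tan(arcmin_rad));
}

double panel_ppi(int w, int h, double diagonal_inches) {
    return std::sqrt(double(w) * w + double(h) * h) / diagonal_inches;
}

int main() {
    std::printf("eye limit at 24\": %.0f PPI\n", acuity_ppi_limit(24.0));      // ~143
    std::printf("27\" 4K panel:     %.0f PPI\n", panel_ppi(3840, 2160, 27.0)); // ~163
    std::printf("27\" 8K panel:     %.0f PPI\n", panel_ppi(7680, 4320, 27.0)); // ~326
}
```

By that estimate a 27" 4K panel is already past what a typical eye resolves at two feet, so desktop 8K would be buying pixels nobody can see.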

Personally, I’m still waiting a while before I buy 4K. While it’s getting into the mainstream now, it’s still only in high-end builds. If I don’t look at a 4K screen for another year or so, I can convince myself I don’t know the difference.

I have a Seiki 39" TV via HDMI. Not retina, but I use all the real estate because of that. It was $339 from Newegg last I checked, but I got mine from Amazon at a similar price. There are some hiccups related to it being a TV, but no messy bezels.

The scale value you want to use depends primarily on the size of the monitor. For a 24" 4K monitor like mine, Windows defaults to 150%, which works well. It is also definitely “real retina”; not that “retina” is anything but an Apple marketing term, so poorly defined that it really isn’t useful.
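
To make the scaling tradeoff concrete: the scale factor divides your logical workspace, so 150% on a 4K panel leaves you the workspace of a 2560×1440 monitor, just drawn far more crisply. A quick sketch of the arithmetic (the scale values shown are just the common Windows presets):

```cpp
#include <cstdio>

// Logical workspace left after applying a DPI scale factor: each logical
// pixel is backed by scale^2 physical pixels, so the workspace shrinks.
void workspace(int w, int h, double scale) {
    std::printf("%dx%d at %.0f%% -> %.0fx%.0f logical\n",
                w, h, scale * 100.0, w / scale, h / scale);
}

int main() {
    workspace(3840, 2160, 1.0);  // 3840x2160 logical: tiny text on a 24" panel
    workspace(3840, 2160, 1.5);  // 2560x1440 logical: the Windows default at 24"
    workspace(3840, 2160, 2.0);  // 1920x1080 logical: pixel-doubled, retina-style
}
```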

MST is actually a better option in many cases, as it works with more devices (it doesn’t require as high a pixel clock from the CPU/GPU to do 60 Hz). SST has some advantages and will eventually be the better option, but right now on some devices (like my Surface Pro 3) it is limited to 50 Hz, and even then only with some futzing with driver settings, it seems.
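
To see why SST 4K at 60 Hz is demanding, here is a back-of-envelope bandwidth estimate. These figures count active pixels only (real links also carry blanking overhead), and the comparison to DisplayPort 1.2’s roughly 17.3 Gbps of four-lane HBR2 payload is my own framing:

```cpp
#include <cstdio>

// Rough uncompressed bandwidth for a display mode, active pixels only.
// Real timings add blanking on top (CVT-R2 adds several percent more).
void mode_cost(const char* name, int w, int h, int hz, int bpp = 24) {
    double pixel_rate = double(w) * h * hz;   // pixels per second
    double gbps = pixel_rate * bpp / 1e9;     // raw bits per second
    std::printf("%s: %6.1f Mpx/s, ~%4.1f Gbps at %d bpp\n",
                name, pixel_rate / 1e6, gbps, bpp);
}

int main() {
    mode_cost("4K @ 60 Hz SST", 3840, 2160, 60);  // ~497.7 Mpx/s, ~11.9 Gbps
    mode_cost("4K @ 50 Hz SST", 3840, 2160, 50);  // ~414.7 Mpx/s, ~10.0 Gbps
    mode_cost("MST tile @ 60 ", 1920, 2160, 60);  // each half is an easier stream
}
```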

Windows 10 actually does about as well as can be expected with two different DPI scales, and Windows has supported different scale settings per monitor since 8.1. Last I knew, and contrary to your claims, Mac OS X didn’t support different DPI scales on different monitors at all. Hell, it wasn’t until Yosemite that my Mac could even output to my 4K monitor without horrible screen artifacts…

On Win10, the main caveat with multiple DPI scales across monitors is that if you drag a window that doesn’t support dynamic DPI scale adjustments from one monitor to another, it gets scaled by the compositor; it only renders at the native DPI scale of the monitor it was launched on. Modern apps (like all those from the Store or built using WinRT, and most inbox ones) support dynamic DPI scale changes, so this isn’t a problem for them. Most legacy apps do not. The OS is really doing all it can there, and the situation is the same on OS X.
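
For the curious, “supporting dynamic DPI scale changes” in a classic Win32 app essentially means declaring per-monitor DPI awareness and handling WM_DPICHANGED. A minimal sketch of that pattern (window creation and error handling omitted; this is illustrative, not how any particular app does it):

```cpp
#define _WIN32_WINNT 0x0603     // Windows 8.1+, where WM_DPICHANGED appeared
#include <windows.h>
#include <shellscalingapi.h>    // SetProcessDpiAwareness; link against Shcore.lib

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam) {
    switch (msg) {
    case WM_DPICHANGED: {
        // Windows suggests a new window rectangle sized for the new DPI;
        // apps that skip this get the compositor's stretched-bitmap scaling.
        const RECT* suggested = reinterpret_cast<const RECT*>(lParam);
        SetWindowPos(hwnd, nullptr,
                     suggested->left, suggested->top,
                     suggested->right - suggested->left,
                     suggested->bottom - suggested->top,
                     SWP_NOZORDER | SWP_NOACTIVATE);
        // A real app would also recreate fonts/layout for the DPI
        // delivered in LOWORD(wParam) here.
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}

int main() {
    // Opt in to per-monitor awareness; otherwise the system pretends the
    // DPI never changes and bitmap-scales the whole window instead.
    SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE);
    // ... register the window class, create the window, pump messages ...
}
```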

Actually, as you can see in that thread, you can do 4K at 50 Hz from the SP3 to an SST monitor, but it may take some extra steps depending on the driver version. MST monitors (like the UP2414Q) will do 60 Hz over it, but sadly not without some disadvantages.

The main problem right now is that Intel regressed their MST support in their Win10 drivers, so I’m running the Win8 ones for now. The Win10 driver has glitches on MST monitors, which I reported to them 7 or 8 months ago and which they have yet to resolve.

Your article is very weak and full of outdated crap. But I had to completely stop reading at your inane definition of proper viewing distance.
An arm’s length? Seriously? What is this, the dark ages?

I’ve had this monitor for about a year now as a second monitor on a Retina iMac.

Apart from the color fidelity, which is slightly poorer (though for my non-professional usage the issue is ignorable), I have only one complaint with it, and it is a minor one.

It just takes too long to switch out of standby mode. When the display goes to sleep, it requires about 2-3 seconds to come online, while the Retina iMac’s main display is near instantaneous. This causes ugly flickering as windows are moved back and forth between screens while the OS detects the screen being “added” again. So, you have to wait 3-4 seconds after each wake-up before you can start using it.

So, if you are mixing and matching displays, consider finding one that wakes faster. Maybe in the last year something faster has come on the market at the same price point. If this is your only display on the machine, then it won’t be an issue.


Sidestepping the trolling, how far away is your work monitor from your body?

Mine is certainly at arm’s length.


How does this monitor setup perform with remote desktop?

For coding, I like to remote into my laptop from a desktop computer with multiple monitors and tick “Use all my monitors for the remote session”. It’s a simple setup where I get to use multiple monitors with the laptop, without a docking station.

So how many of these 4K monitors can remote desktop handle, and what degradation do you see compared to a direct connection?
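
For reference, that checkbox maps to a saved .rdp property and a command-line switch, so the same setup can be scripted. A small sketch (connection.rdp is a placeholder name):

```
:: same as ticking "Use all my monitors for the remote session"
mstsc /multimon connection.rdp

:: or persist it inside the saved .rdp file itself:
::   use multimon:i:1
```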

This post is very misleading. 4K is nowhere near mainstream and nowhere near being a worthy investment, especially at this price point. And then, you say 4K, but really it’s not; it’s just pre-real-4K crap that you should stay away from (4K is 4096×2160, NOT 3840×2160).

Before getting started, I would like to clarify that I’m only talking about IPS (and variant) screens; I don’t think anyone should buy a TN screen nowadays (which is what cheap 4K tends to be).

So in buying one of these, here is what you get:

  • a premium price, because you are too early to the party
  • a technology still in an early state that still has issues
  • the need for a beast of a GPU without actually being able to benefit from it that much (more pixels, but if you need to lower some settings to get decent FPS, the extra pixels are worthless)
  • not much content that will use this resolution (what’s more, 1080p streaming still isn’t possible for everyone), and even when it does come around, the bandwidth needed is way beyond most internet connections (fibre isn’t there yet; come back in at least 5 years, 10 is more likely)
  • software support that is still “hit and miss”
  • no real estate gain unless you get at least 40" (which is gigantic and way overpriced)
  • an actual quality improvement for text that is not that convincing (I found it disappointing once I tried it)

I would urge anyone willing to jump on the 4K bandwagon to try it beforehand, because it is probably not worth it unless you have some special need.

I built myself a computer last month and was getting a new screen as well. I had a 1440p screen and of course wanted to jump to 4K. My build already had a GTX 980 Ti, so the GPU wasn’t an issue.

So I tried both, and after some testing it became clear it wasn’t worth it at all. The GPU was running hotter all the time, and games weren’t running that well. You get 35-something FPS in the latest GPU-hungry games, which is not OK, and if you lower some graphics-quality settings, you tend to notice that more than the extra pixels make up for.

And actually, for movies and games, the difference between 4K and 1440p at this screen size is barely visible (good luck finding 4K movies in the first place). You actually need some time to tell one from the other; if you showed it to a non-tech-savvy person, they probably wouldn’t notice the difference until you explained it (an explanation was needed for my mum and two friends, so…).
It has already been shown that people can’t really tell the difference between upscaled 1080p and true 4K when watching movies, so I guess it’s no surprise.

To me, the difference between 4K and 1440p feels like the difference between Thunderbolt and USB 3. Sure, Thunderbolt is technically superior; it allows much more and it is faster, but for 99% of people it doesn’t bring enough to the table to warrant the price difference.

Some will tell you they can totally see the difference and that it is totally necessary for them, just like some claim they can totally see the difference past 60 FPS when it has been shown that past ~50 FPS you can’t really tell much as long as the frame rate is stable, or just like some claim they can totally hear the difference between lossless and a 256 kbps MP3 when they really can’t.

Since you proved that last point on this very website, I’m a bit disappointed by this post; I find it very misleading for someone who is really searching for answers, as I was before trying it out myself.

In conclusion, anyone looking to buy a new screen should probably stick with 1440p; they will get much more for the money. My $500 1440p screen has so much more going for it than the $700 4K one I sent back that it’s not even funny: a USB 3 hub, an audio hub, 10-bit support, and a nice mounting stand with all kinds of adjustment. At this price point there is no 4K screen (even “false” 4K) that has any of that, and those things most likely matter more to most people in the long run.

So unless you can shell out at least a thousand bucks for an LG 31MU97, you should probably stick to 1440p, or go ultra-wide if you need screen real estate (and don’t want to go for multiple screens, which, contrary to popular belief, is a productivity killer).

But better yet, you should probably wait for full HDMI 2.0 support, DisplayPort 1.3 support, mainstream G-Sync/FreeSync support, and GPUs that can run everything at this resolution without lowered settings, low FPS, or multiple GPUs (that’s going to take at least one or two years).
But really, the real deal will come with 5K, which is to 4K what 1440p is to 1080p; in the meantime your money is probably better spent elsewhere.

If you do have a lot of money, by all means go for it; 4K screens are technically better. You are just going against the Pareto principle, spending a lot of money to get that last 20 percent…

Clement, while you’re right that 4K is riddled with problems (as I explained earlier), you’re wrong about FPS: people can discern differences even above 120 Hz, and there is still a huge difference between 50 and 60. What you can perceive depends on your eyes, but many factors play into it. As someone who runs multiple displays of various refresh rates in parallel, I can tell you there is a significant difference, and it matters more the faster the action is.

5K is already here (I own one and have owned two so far), but like all MST implementations, it has its share of issues.


Yes, but only the very latest Skylake integrated GPU supports 3 displays at 4K and 60 Hz. I am not sure the Haswell GPU even supports a single 4K display at 60 Hz!

Are they TN displays? Every pixel on a TN display is a bad pixel. It’s worth the extra money for IPS.

I think a constant 60 FPS is achievable in most games with a nice enough video card and/or reduced settings. Since G-Sync only matters when the frame rate deviates a lot from 60 FPS:

Frame rendering times tend to vary, though, as we’ve noted. As a result, even with some buffering, the system may not have a frame ready at the start of each new refresh cycle. If there’s no new frame to be displayed when it’s time to paint the screen, the fallback option is to show the preceding frame once again and to wait for the next refresh cycle before flipping to a new one.

This wait for the next refresh interval drops the effective frame rate. The usual refresh interval for a 60Hz display is 16.7 milliseconds. Turn in a frame at every interval, and you’re gaming at a steady 60 FPS. If a frame takes 16.9 milliseconds to render—and is just 0.2 ms late to the party—it will have to wait the remaining 16.5 ms of the current interval before being displayed. The total wait time for a new frame, then, will be 33.3 ms—the equivalent of 30 FPS.
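
In other words, vsync rounds every frame time up to the next multiple of the refresh interval, which is exactly the math in the quoted passage. A quick sketch of it:

```cpp
#include <cmath>
#include <cstdio>

// With vsync on a fixed-refresh display, a frame that misses a refresh
// deadline waits for the next one, so its effective display time rounds
// UP to a multiple of the refresh interval.
double effective_frame_ms(double render_ms, double refresh_hz = 60.0) {
    double interval = 1000.0 / refresh_hz;              // 16.7 ms at 60 Hz
    return std::ceil(render_ms / interval) * interval;  // quantize upward
}

int main() {
    std::printf("16.0 ms -> %.1f ms\n", effective_frame_ms(16.0)); // 16.7 ms
    std::printf("16.9 ms -> %.1f ms\n", effective_frame_ms(16.9)); // 33.3 ms
    std::printf("so 16.9 ms displays at %.0f FPS\n",
                1000.0 / effective_frame_ms(16.9));                // 30 FPS
}
```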

Wouldn’t you be better off with a 49" 4K TV, like @ofthetimelords suggested? Probably cheaper too.

Yes! Start with a single 4K monitor, then add another later, and another. As long as your current video card supports 4K at 60 Hz you can keep it and upgrade later when you get your 2nd or 3rd 4K monitor.

@Clement_Collier the monitor I am recommending is IPS. You also don’t need that expensive of a video card unless you plan to game, or do triple monitors.

Sure, future 5K models will be a bit better, but the resolution difference of 4K at 27" is quite apparent with text, and I stare at text all day long. Like I am right now!