Our Brave New World of 4K Displays

Since I have not experienced 60fps+ for long enough to be sure, I won't try to argue too much. Let me just say that I am very skeptical… The short try I had on a 120Hz screen didn't leave me with the impression of missing anything (compared with a 60Hz one).
To me it feels a lot like the hi-fi guys who claimed they could definitely tell the difference between compressed audio (with a decent algorithm and decent settings) and CD-quality audio or better. That has been debunked with scientific A/B testing quite a few times by now, leaving the believers in the dust without many arguments left.
I also remember reading a paper demonstrating that people couldn't tell the difference above around 50fps; if I find it again I'll forward it.

But even if we could discern it, there are still fundamental issues that make the purchase hard to justify: what about the GPU? Even my GTX 980 Ti can't output that kind of FPS in the latest AAA games at just 1440p. Since we are talking about motion for this to matter, it is really only useful for games, so I don't see how one can justify the need when you can't fulfill it to begin with. Unless you go 2-, 3-, or 4-way SLI, but at that price point, if someone has that kind of money to put into a gaming machine, I don't think they need a rational justification.
For movies, I think The Hobbit showed the world how little going beyond ~30fps brings to the table (nothing…), and for everything else, well, 60fps is just fine really.

In my opinion this is just belief, but I would be happy to be proven wrong by scientific testing. In the meantime I'll keep trusting what my experience tells me over what other people would like me to believe. Everyone is entitled to their opinion, I guess.

For 5K, saying "it is here" is taking quite a shortcut; not only does the pricing put it out of reach of most people, but there are still the fundamental issues of software support and GPU requirements. The 5K iMac already has trouble running Mission Control without lag. Sure, it does not have the best GPU out there, but we are talking about a GUI, not a AAA game. Games simply do not run at this resolution unless you turn every setting down to minimum, which completely defeats the purpose of having a gaming machine in the first place (and high-resolution gaming doesn't bring that much to the table anyway).

I think it will take far more time than you believe for a decent percentage of people to be running these screens (at least 5+ years). Currently it is less than one percent of the Steam user base, so I wouldn't hold my breath.

But I guess this is the year of the 4K hype.

Sensitivity to speed (let's call it FPS and Hz, assuming a perfect situation) drops as you go higher. You'll always see 30→60, and it will look like a much bigger change than 60→144, even though the increase in Hz is actually much smaller. As you climb the ladder it becomes less visible, but, from both (medical) theory and practice, it is visible.
It depends highly on content, of course. Most people see the difference on the desktop: mouse movement, scrolling, window and menu animations, etc. Of course, for non-gamers the benefits are very small, since in desktop work you don't really gain much from nice smooth menu animations. :slight_smile:
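To put rough numbers on that diminishing sensitivity (my own back-of-the-envelope arithmetic, not from any study): the absolute frame-time reduction shrinks as the refresh rate climbs, even though the jump in Hz gets bigger:

```latex
% Frame time t = 1/f; the time saved per frame by each jump in refresh rate:
% 30 -> 60 Hz:  1/30 - 1/60  = 33.3ms - 16.7ms = 16.7ms saved
% 60 -> 144 Hz: 1/60 - 1/144 = 16.7ms -  6.9ms =  9.7ms saved
\Delta t_{30 \to 60}  = \tfrac{1}{30} - \tfrac{1}{60}  \approx 16.7\,\mathrm{ms}
\qquad
\Delta t_{60 \to 144} = \tfrac{1}{60} - \tfrac{1}{144} \approx 9.7\,\mathrm{ms}
```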

However: when gaming, especially in fast-paced genres like simulations, FPS titles, and platformers (strategies not so much), you can get a significant benefit from ULMB (strobing the backlight in sync with the refresh, which makes for a motion-blur-free experience) and even from syncing framerate to refresh rate.
Yes, you need a beefy machine. I use a Titan X, but as you've said, that is above what mainstream users can afford.
5K is here, technically. It has problems due to MST, and yes, you need the hardware to drive it. Of course, you need four Titan Xs to run The Witcher 3 maxed out, but you only need one to run Diablo 3 or StarCraft 2 at 60fps and above.

For general desktop work there are no performance penalties. The very same Titan X renders the Windows, Linux, and Mac OS X desktops fluidly (I have all three installed).
The benefits are huge: at 200% scale you get 2560x1440 of screen real estate, but at double the quality. As I've said before, for looking at code it is a phenomenal experience. All supported apps and the web look amazingly good, and that is something you'll find hard to give up once you try it.

A fairly cheap dual GTX 970 setup can run high resolutions "fine".

Weird, I got an e-mail about a (hostile and very misinformed) reply from @Tonci_Jukic but don’t see it here. Guessing it was deleted? (or moderated?)

Anyway, hopefully no one believes his misinformation about Windows support for multiple displays at different scale factors. It absolutely supports rendering things correctly for different displays. I don't know why he'd want to keep pushing that falsehood. I explained how legacy apps that don't support moving between displays at different DPI scales will end up blurry when launched at one DPI scale and moved to a monitor with a different one, but again, that's an app issue, not an OS issue.

Also, it's preposterous to say that no one cares about Intel graphics running 4K displays, particularly when this very thread has multiple people talking about using 4K monitors with the Surface Pro 3 (as I'm doing right now). Claiming that you need a dedicated GPU to do work on a 4K monitor is absurd.

I rented a cinema near my house and connected my 20 NVidia graphics cards to their cables, so that I can work using the cinema room wall as my monitor…

You cannot imagine how fun it is to play DOOM or Mortal Kombat X in a setup where the villains are actually bigger than the real ones!

At 30Hz that is going to be quite painful, though. I wouldn’t recommend it.

This is how a Dell UP2715K (5K, 5120x2880) screen looks when it is not the primary display (left) and when it is the primary display (right). In both cases we're talking about Windows 10, the NVidia 355.60 driver, and the 5K display set at 200% scaling (not that it matters), together with a regular 27" 2560x1440 display set at 100% scaling:
(I can’t attach images, so a click is necessary)

http://imgur.com/84zYYxt

An average viewer should be able to see the difference in font rendering, which unfortunately applies to all UI elements.
A fully patched Windows 10 system running an NVidia Titan X with the latest driver renders the displays with a clear difference depending on which display is set as primary:

http://imgur.com/sRAiIRh

Unfortunately, whichever display has priority will be crisp and clear; the rest will be blurred. That is a fact.

P.S. This does not happen on Mac OS X. :slight_smile:

But why would you run it at 30Hz?

I run my UP2414Q (MST) at 60Hz from my i5 SP3 at work. If you have an SST-only display, you can do at least 50Hz from it, which should be fine for most uses (though I'm obviously hoping the SP4 can do 60Hz on those).

MST (Multi-Stream Transport) is a nasty, bug-prone hack that pastes several panels together and pretends they are one contiguous display; I don't think anyone should buy those kinds of displays, for the reasons @Tonci_Jukic outlined. Lots of bugs, quirks, and questionable future support.


Maybe there’s an issue specific to Nvidia? We have setups with a mix of 150% DPI and 100% DPI displays in our office, driven from Intel GPUs on Win10. They have no such problem. Windows runs each display at the correct resolution and DPI scale. You can see apps that support DPI-scale changes (like all modern/store apps) resize as you drag across monitors, snapping to the new size when more than half the window crosses the boundary between monitors (obviously if it’s spanning multiple monitors, half of it will be either too large or too small).

Apps that don’t support dynamic DPI changes will get bitmap scaled when on a different monitor than where they were launched, but there’s not much that can be done about that without app updates.
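For anyone curious what "supporting dynamic DPI changes" actually means at the code level, here's a minimal Win32 sketch (my own illustration, assuming Windows 8.1+ APIs, not anyone's production code): a per-monitor-DPI-aware app opts in at startup, then handles WM_DPICHANGED when it crosses to a monitor with a different scale factor.

```c
// Minimal sketch of a per-monitor DPI aware window (Windows 8.1+).
// Compile: cl dpidemo.c user32.lib shcore.lib
#include <windows.h>
#include <shellscalingapi.h>   // SetProcessDpiAwareness

#ifndef WM_DPICHANGED
#define WM_DPICHANGED 0x02E0   // defined in newer SDK headers
#endif

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg) {
    case WM_DPICHANGED: {
        // LOWORD(wp) is the new DPI: 96 = 100%, 144 = 150%, 192 = 200%.
        // lp points to a rect Windows suggests so the window keeps the
        // same physical size on the new monitor; snap to it.
        const RECT *r = (const RECT *)lp;
        SetWindowPos(hwnd, NULL, r->left, r->top,
                     r->right - r->left, r->bottom - r->top,
                     SWP_NOZORDER | SWP_NOACTIVATE);
        // A real app would also rebuild fonts/bitmaps for LOWORD(wp) here.
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE prev, LPSTR cmd, int show)
{
    // Opt in to per-monitor DPI awareness. Legacy apps that skip this get
    // bitmap-stretched on mismatched monitors - the blurriness described above.
    SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE);

    WNDCLASS wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.lpszClassName = "DpiDemo";
    RegisterClass(&wc);

    CreateWindow("DpiDemo", "Per-monitor DPI demo",
                 WS_OVERLAPPEDWINDOW | WS_VISIBLE,
                 CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
                 NULL, NULL, hInst, NULL);

    MSG m;
    while (GetMessage(&m, NULL, 0, 0) > 0) {
        TranslateMessage(&m);
        DispatchMessage(&m);
    }
    return 0;
}
```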

Sounds like you have a peculiar setup, or a driver problem, or you’re demonstrating an app that doesn’t support dynamic DPI scale changes on a different monitor from where it was launched.

That is NOT true. There are some cheaper panels that work that way, but that's separate from the MST protocol. Many higher-end displays (like the UP2414Q) are one contiguous panel but support MST. MST is just the DisplayPort protocol used to power them. It's like dual-link DVI in the old days. It uses two parallel channels that each drive half of the display. This is transparent to the OS and the user. It functions exactly the same as an SST display (save for driver bugs like the glitches in Intel's current Win10 drivers, which is why Intel users will want to use the Win8 driver for now).

The reason for MST in this case is one of two things (rough numbers below):
A) The monitor, while a single panel, doesn't have ASICs that can handle SST 4K at 60Hz.
B) The graphics output of the device (like the Haswell U chip in the SP3) doesn't have a pixel clock fast enough to support SST at 60Hz, but it can handle MST at 60Hz just fine.
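To make B) concrete with rough numbers (my arithmetic, using the commonly cited CVT reduced-blanking figure): single-stream 4K at 60Hz needs a pixel clock of roughly 533MHz, but each MST tile is only half of that, which is why hardware that can't clock one full 4K stream can still drive two half-width streams:

```latex
% Single-stream 4K @ 60 Hz vs. one MST tile (each tile is 1920 x 2160):
f_{SST}  \approx 533\,\mathrm{MHz}
  \quad (3840 \times 2160 \times 60 \approx 498\,\mathrm{Mpx/s}\ \text{active, plus blanking})
f_{tile} \approx f_{SST} / 2 \approx 267\,\mathrm{MHz}
```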


The very same problem happens on an XFX Fury X with the 15.7 driver (or later, not sure, I don't have it here). The situation was even worse in Windows 8.1 with the taskbar: non-primary displays were forced to use the primary display's scaling on the desktop. Windows 10 renders the taskbar correctly, but icons still keep primary-display scaling on all displays, so when the 5K is primary at 200%, the other 100%-scaled displays keep huge icons, 2x in size. That looks ridiculous:

http://imgur.com/gC4gpqY

What you just described is a different issue. Yes, scaling fails in non-supporting apps when transitioning from display to display, but I find that a minor problem and, as you say, an app problem, not a Windows one. When I open Skype for Business while the 5K is primary, it is huge if I move it to any other display. But that too is a minor annoyance that can be fixed with an application update.

I find rendering clarity a more serious issue. What's in that screenshot up there is actually a clean start: change the primary setup, log out for the changes to apply, and run the application again (Firefox, Chrome, Blender, Visual Studio, whatever). All applications have the same issue, including Visual Studio.

Try running a more extreme setup, at 200% scale. I presume you can't see the difference because of the combined effect of a larger dot pitch and a smaller scale difference. At a larger scale it really pokes you in the eye, and it is an especially bad problem when an already not-so-sharp "normal" display is not primary: normal displays already have lower sharpness, and then this issue comes on top.

Yes, someone who uses two or three identical 4K/5K displays doesn't really care, since scaling is identical. But those who mix really do have an annoying problem. It is even tempting to make the lower-resolution display primary, since the high-resolution one will look less bad, but that completely defeats the purpose of scaling and high-res in the first place.

Well, "except for bugs" implies to me it is not in fact transparent to the OS and the user. Lots of reviews mention MST bugs and issues! It's just not a wise choice at this point in time, e.g.

After a couple display firmware updates by Sharp we were able to manually configure the monitor to run at 60 Hz using the display’s built-in controls (Choose Menu > Setup > DisplayPort STREAM > MST > SET). Basically you had to use Multi-Stream Transport (MST) with DisplayPort to get two 1920 x 2160 images which are then combined to produce the final 3840×2160 image on the display.

AMD, and NVIDIA all were able to support this functionality in their drivers, but there were glitches along the way and they were pretty bad at times. The last thing you want to see when you fire up a game is that only half your display is working.

Better to avoid that madness altogether and pick a monitor (and video card) that properly supports SST / 60Hz at 4K.

What is not true? I can't comment on that since it is not clear, but:

Every single MST display I have had (and I've had plenty: UP3214Q, PQ321Q, Sharp, and UP2715K now; and I don't find the Dell 5K cheap, to be honest) had the same issues. Those issues are handled by firmware and the graphics driver. It is a constant battle between driver and display manufacturers over who is to blame (it has nothing to do with the OS), but the problems are real: sleep issues, boot issues, screen artifacts, broken syncs…

Check this out:

http://jmswrnr.com/blog/the-firmware-issues-of-dell-up3214q-and-up2414q-a00-monitors/

:slight_smile:

Except for bugs with the (currently labeled "beta") driver from one manufacturer for a just-released OS version… and with an easy workaround (use the Win8 driver).

Obviously there have been a lot of issues in the past with 4K, regardless of MST or SST. Thankfully things are a good deal better now, though obviously not perfect yet.

The review you quoted (which is itself over a year old) specifically says that MST worked for them at 60Hz, and that some drivers used to have issues. How is that a reason to stay away today?

If your system can support SST at 60Hz and the monitor you want also supports it, that's obviously great. But I think it's very valuable information for people with popular hardware like the SP3 to know that they can get 60Hz over MST monitors very easily, or 50Hz over SST monitors with a bit of (unfortunately hacky) work.

Dell’s early revisions of the UP2414Q do have an annoying wake issue, where sometimes after it sleeps you have to power it off and back on (or undock/redock your machine) to get it to wake up again. As far as I understand that’s specific to the monitor though, and everything I’ve found indicates that Dell has fixed that in later revisions of the monitor (though sadly it’s apparently not something you can fix with a software/firmware update). That’s a monitor bug though, not anything specific to the MST protocol.

Otherwise mine's been working great for over a year, running at 60Hz from my i5 Surface Pro 3. The only time I've had glitches is when using Intel's beta Win10 drivers, and I'm hopeful they'll fix that problem soon. In the meantime, using their Win8 driver avoids the glitches.

Oh, and I thought it was clear from the context, but I was saying it is not true that MST displays are made up of multiple panels. Perhaps some cheap ones work that way, but the good displays do not (and some displays support both MST and SST modes - again with a single physical panel).

Well: they're not made of multiple physical panels, but the picture is indeed assembled from multiple pictures: two vertical halves, side by side. All MST monitors work that way.

Unfortunately, it is not an isolated problem with a specific monitor, but all of them. And it happens on Win 8.1 too.

SST never had problems, aside from its requirements, of course. But modern graphics cards support everything required (and Intel is quite a crap GPU product, to be honest: faked driver features, bad-quality drivers…).

Intel generally has the highest quality drivers and the best GPUs for high-end ultrabooks where gaming isn’t a priority but fast desktop performance and low battery usage are key. If your goal is 3D gaming or other 3D intensive workloads, then yeah, you probably want a discrete graphics chip. If you’re a software developer like me and just want a quality desktop experience with crisp fonts and ample workspace with solid desktop performance and reliability (and battery life), then Intel’s on-chip graphics are generally great and getting better at 3D stuff all the time.

I don’t know why you’re repeating what I said about how MST works, but okay. As I said that’s an implementation detail with no impact to the user. Just like dual-link DVI with high res monitors before DisplayPort came along.

I don't know where you're getting the idea that all MST monitors have the wake problem. It's well confirmed as fixed in the later UP2414Q/UP3214Q models (and yes, on the original A00 revision it happens on all OSes, including Mac OS X, since it's a monitor bug). See posts here from people who exchanged their A00 for an A01:
http://commweb-ps3.us.dell.com/support-forums/peripherals/f/3529/t/19536443?pi23185=33

I'll skip commenting on the Intel driver; too funny.

Where do I get the idea that all MST monitors have the wake problem? Hmm… from experience?
How many 4K or 5K displays have you used yourself in the last two years, so that you can talk from experience? Because it would really come as a shock to me that I got broken samples of every model I had (and I had pretty much all of them).

I think it is wrong to say it has no impact on users. A plethora of issues comes from the very core of the MST implementation, and that directly impacts users. I've never seen a glitch with dual-link DVI. Have you?

And regarding the fix: no, it has never been fixed. I've even used an A02 revision (both of my UP3214Qs went through three replacements) and it was never fixed in the end.

The very link you offered up there has a funny comment from a user who got a "fixed" A01 display:

"Well, i received monitor last night, it worked 5-6h(i imidalty swaped to 1.2 dp so i can have 60hz) , than problems started. It started to go to powersaving mode on random … during my work, and during my game , than it shows half screen, etc. I also having problems waking monitor up from sleeping mode ( works only if u restart/shut down pc )… "

I won't go into operating systems again. I rest my case with the screenshot up there. (Confirmed on AMD cards too.)

Jeff,

I was looking at getting 2x GeForce GTX 970 instead of 1x 980 Ti.

What are your thoughts on that, as you alluded to problems with multi-GPU on the last build?

Justin

Probably fine; my multi-GPU setups were quite stable. However, there is always more complexity, more power draw, and, according to The Tech Report, more variability in game frame rates with two GPUs.

Both Nvidia and AMD have worked on the multi-GPU frame rate variability problem a fair bit since it was publicized in 2012/2013, so it might not be a huge issue today.

In general it is better to get a single fast card rather than two slower cards. That might be more expensive if it forces you into the super-premium enthusiast cards, so it is a reasonable trade-off to make.

The GPU guys must be super excited about 4K, since it is a very legitimate reason to need even more obscene GPU horsepower than we have today. If displays were stuck at 1080p, you could basically stop buying new GPUs from this day forward and be fine.
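Rough arithmetic on why (my own numbers): 4K pushes four times the pixels of 1080p, and 5K nearly doubles 4K again, so fill-rate and memory-bandwidth demands scale accordingly:

```latex
% Pixel counts relative to the previous tier:
\frac{3840 \times 2160}{1920 \times 1080} = 4
\qquad
\frac{5120 \times 2880}{3840 \times 2160} \approx 1.78
```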