LCD Progress

After revisiting my ongoing three monitor obsession recently, I was compelled to upgrade my current mongrel mix of varying LCD monitor brands and sizes. I settled on three 20" Samsung 204B panels.


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2006/12/lcd-progress.html

Which video card(s) are you using to get all three monitors hooked up with DVI?

At work:

GeForce FX5200 PCI
GeForce 7600GT PCIe

At home:

ATI X1900XTX PCIe
ATI X600 PCIe

It’s generally a good idea to use cards from the same vendor so you can install one video driver instead of two. Windows Vista requires this to get the hardware-accelerated GUI across all three monitors.

Also, if you don’t want to futz around with multiple video cards, you can try the Matrox TripleHead2Go:

http://www.codinghorror.com/blog/archives/000740.html

To clean my LCDs of dust, I just use one of those disposable electrostatic dust cloth thingies. One lasts a long time, and sits quietly in a drawer until I need it. A light wipe, and I’m immediately done – no drying time, no residue.

I’m partial to the Pledge brand, perhaps foolishly thinking that Johnson is slightly less evil than most of the other players in the market.

Make sure you get a flavor that does not contain polish, scent, or anything else that would leave a residue.

I have a Radeon X700 and two NEC 70GX2 panels. I run one panel from DVI and one from VGA (that’s all the card has). I can’t tell the difference in image quality between those panels – and I’m very picky. I used to believe the “DVI is better” myth, but I have one anecdotal example sitting on my desk that says otherwise.

YMMV

-John

What a decade, by 2010 we should’ve successfully replaced all bulky CRTs with LCDs …

By 2010 I hope we have SED displays, if they’re as good and as cheap to produce as they supposedly are. I’m a stickler for the color quality CRTs had compared to LCDs, and I can’t wait for a technology (SED) with the advantages of both.

  • Note: It’s worth mentioning that I use 20" LCDs all day at work and at home (general use, games, and movies) and I quite like them, but I keep a 19" CAD-quality CRT as a secondary (soon to be tertiary, I hope) display at home. I just want to represent all fronts.

Great article. Why not run an LCD monitor at 75 Hz?

In short, because refresh rate has to do with the electron gun in a CRT; LCDs have no electron gun and therefore no refresh rate.

http://en.wikipedia.org/wiki/Refresh_rate

In a CRT, the scan rate is controlled by the vertical sync signal generated by the video controller, ordering the monitor to position the electron gun at the upper left corner of the raster, ready to paint another frame. It is limited by the monitor’s maximum horizontal scan rate and the resolution, since higher resolution means more scan lines. Increasing the refresh rate decreases flickering, reducing eye strain.

Much of the discussion of refresh rate does not apply to LCD monitors [because they have no electron gun]. A phosphor on a CRT will begin to dim as soon as the electron beam passes over it. LCD cells open to pass a continuous stream of light, and do not dim until instructed to produce a darker color.

LCDs do have a refresh rate. To avoid confusion, you might call it an update rate: the rate at which new pixel values are sent to the screen so it can update the display. (You still have to wait for the liquid crystals to respond.) If the monitor is on a digital output you can crank the refresh rate as high as the monitor supports – in theory everything is smoother. In practice it’s hardly noticeable, especially if you’re not a gamer. On analog outputs it may be better to use a lower refresh rate to get a crisper signal. But again, if your equipment and cables are good quality, the difference isn’t all that noticeable.
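
To put rough numbers on that, here’s a back-of-the-envelope sketch in Python. The 8 ms grey-to-grey response time is an assumed example figure, not a measurement of any particular panel:

```python
# Rough sketch: how the update rate and the liquid crystal response time
# combine. All numbers are illustrative examples.

def frame_period_ms(refresh_hz):
    """Time between new frames arriving at the panel, in milliseconds."""
    return 1000.0 / refresh_hz

LC_RESPONSE_MS = 8.0  # assumed grey-to-grey response time of the panel

for hz in (60, 75):
    print(f"{hz} Hz: a new frame every {frame_period_ms(hz):.1f} ms, "
          f"plus up to ~{LC_RESPONSE_MS:.0f} ms for the crystals to settle")
```

Going from 60 Hz to 75 Hz only shaves about 3 ms off the frame interval, which is part of why the difference is so hard to notice.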

Some other comments:
Be very sceptical about stated response time, viewing angle, and contrast numbers. There is no established standard method of measuring them, so they may be off by quite a bit. For example, some manufacturers define viewing angle as the angle at which the monitor can maintain at least 1:10 contrast; for others it’s 1:5. In reality, if you care about colour integrity, the minimum should be something like 1:50 or 1:100. Response times are usually black-white-black transition lengths, with the transition measured starting and ending at 5/95% or 10/90% luminosity (depending on who’s measuring). Transitions from light grey to dark grey can take several times longer.
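
To show how much the measuring window matters, here’s a minimal sketch that extracts a response time from a luminance trace; the exponential settle curve is synthetic, purely for illustration:

```python
import math

# Sketch: compute a "response time" the way panel makers do -- the time
# for luminance to travel between two thresholds of a transition. The
# trace is a synthetic exponential settle, not data from any real panel.

def response_time(times_ms, luminance, lo, hi):
    """Time from first crossing `lo` to first crossing `hi`."""
    start = next(t for t, y in zip(times_ms, luminance) if y >= lo)
    end = next(t for t, y in zip(times_ms, luminance) if y >= hi)
    return end - start

times = [i * 0.1 for i in range(300)]            # 0..30 ms in 0.1 ms steps
trace = [1 - math.exp(-t / 5.0) for t in times]  # fake black-to-white settle

# The *same* transition yields a different spec depending on thresholds:
print(f"10%/90%: {response_time(times, trace, 0.10, 0.90):.1f} ms")
print(f" 5%/95%: {response_time(times, trace, 0.05, 0.95):.1f} ms")
```

The same fake panel measures about 11 ms one way and almost 15 ms the other – exactly the kind of gap the lack of a standard allows.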

Your assumption about most displays supporting 24-bit colour is wrong. Most displays sold today are Twisted Nematic (TN) panels, with extremely fast black-white-black response rates. Unfortunately, fast TN panels usually have only 6 bits of accuracy per colour channel and extremely poor vertical viewing angles. Even though the colour inaccuracy is somewhat masked by temporal dithering, if you move your eyes at the right speed you can see pretty annoying dithering artifacts. And don’t get me started on the viewing angles: if you need to do any colour matching, TN displays are basically useless. Most TN displays show noticeable colour changes over the height of the display thanks to the changing viewing angle.
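
Here’s a small sketch of what that looks like numerically (the dithering scheme is a simplified stand-in for whatever a real panel controller actually does):

```python
# Sketch: what a 6-bit panel does to an 8-bit ramp, and how temporal
# dithering papers over the missing levels. Simplified for illustration.

def to_6bit(v8):
    """Truncate an 8-bit value (0-255) to a 6-bit panel level (0-63)."""
    return v8 >> 2

def from_6bit(v6):
    """Map a 6-bit level back onto the 8-bit scale for comparison."""
    return v6 * 255 // 63

# A smooth 8-bit ramp collapses onto far fewer distinct output values:
print([from_6bit(to_6bit(v)) for v in range(100, 110)])
# -> [101, 101, 101, 101, 105, 105, 105, 105, 109, 109]  (visible banding)

def temporal_dither(v8, frames=4):
    """Alternate two adjacent 6-bit levels so the time-average is closer."""
    lo, hi = to_6bit(v8), min(to_6bit(v8) + 1, 63)
    frac = (v8 & 0b11) / 4  # how far v8 sits between the two levels
    shown = [hi if i < frac * frames else lo for i in range(frames)]
    return shown, sum(from_6bit(v) for v in shown) / frames

shown, avg = temporal_dither(102)
print(shown, f"-> average {avg:.1f} vs. target 102")
```

The flicker between adjacent levels is what you catch when you move your eyes at the right speed.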

I find panels using S-IPS and PVA technology a much better compromise. They have usable viewing angles and 8-bit colour channels. In the last couple of years the response times have also become acceptable to anyone but hardcore gamers. Unfortunately, most manufacturers don’t bother to specify the technology used. Nevertheless, you can probably identify S-IPS and PVA panels by their ridiculously high (thanks to the useless measuring standard) viewing angles of 176 or 178 degrees.

I just spotted that Samsung monitor the other night and finally think I’ve found an LCD worthy of me. (I’m terribly picky, and colors and refresh rates are really bothersome. Most LCDs just don’t look good enough.)

Now I want one.

Are you using all three monitors through DVI?

If the human eye is capable of distinguishing about one million* colors, what’s the point of worrying about whether a monitor can display 16.7 million different hues? Isn’t that just an excuse to sell bigger disks and scanners that can’t be differentiated on actual, you know, features?

Admittedly, I’m a programmer, not a graphics and color-theory geek, so maybe I’m missing something. But obsessing about color depths beyond 24 bits strikes me as one of those “if Superman raced the Flash…” pursuits.

I don’t care if my monitor has one million, 16.7 million, or 24 zillion colours. But I do care that I see clearly visible dithering and/or banding on 6-bit displays. 7-bit is not as annoying, but you do need 8 bits per colour channel and a reasonable gamma curve to avoid noticeable artifacts. Actually you need a lot more than that if you take into account the full dynamic range of the human eye. But current displays can’t manage 1:1000000 contrast yet, so for now 8 bits is enough.
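
A minimal sketch of the gamma point, using the standard CIE L* lightness approximation (the 2.2 exponent is the usual display assumption, and a common rule of thumb puts the visibility threshold near a lightness step of 1):

```python
# Sketch: why 8 bits needs a gamma curve. Compare the perceived lightness
# jump (CIE L*) between adjacent 8-bit codes when codes are stored
# linearly vs. gamma-encoded with the usual 2.2 exponent.

def lightness(Y):
    """CIE L* from relative luminance Y (0..1)."""
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

def step(code, encode, levels=255):
    """Perceived jump from `code` to `code + 1` under encoding `encode`."""
    return (lightness(encode((code + 1) / levels))
            - lightness(encode(code / levels)))

linear = lambda x: x          # code is proportional to luminance
gamma22 = lambda x: x ** 2.2  # luminance = (code / 255) ** 2.2

for code in (5, 20, 100, 200):
    print(f"code {code:3d}: linear dL*={step(code, linear):5.2f}, "
          f"gamma dL*={step(code, gamma22):5.2f}")
```

With linear coding the dark steps come out around ΔL* ≈ 2 (easily visible banding) while the bright steps waste precision; gamma encoding spreads the error much more evenly across the range.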

Are you using all three monitors through DVI?

Yes.

Admittedly, I’m a programmer, not a graphics and color-theory geek,

If your job title is “graphics designer”, color matching is a critical and essential part of your job. But you’re right that for most people, the current level of LCD color fidelity is good enough.

What a decade, by 2010 we should’ve successfully replaced all bulky CRTs with LCDs, many mechanical HDs with solid state, and burning-hot single-cores with moderately warm quad-cores. The BIOS (or whatever replaces it) should support virtualization so that we’ll be able to run whatever operating systems we like seamlessly next to each other, there will be no cables thanks to wireless USB and… and… ok I’ll stop.

I used to believe the “DVI is better” myth

The analog conversion circuitry in your average LCD is fairly good by now, but DVI is still the way to go. Try the monitor test application I linked and do a side-by-side comparison of analog to digital. I think you’ll be surprised.

http://www.passmark.com/products/monitortest.htm

The last time I tried this was with the two 19" Rosewill panels (at the time, my secondary video card had only analog VGA out), and I could easily tell which was which by eyeballing it.

Which video card(s) are you using to get all three monitors hooked up with DVI?

“If the human eye is capable of distinguishing about one million* colors”

My color theory prof. told me that, in addition to that, the million colors we can see is not a subset of the 16 million the monitor displays. It is an intersecting set.
Apparently there are colors that no additive-model display technology will be able to faithfully reproduce.
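
You can see the intersecting-set point with a little chromaticity geometry: an additive display can only mix colours inside the triangle of its primaries. A rough sketch – the sRGB primary coordinates are the spec values, and the spectral cyan point is an approximate CIE 1931 locus value near 490 nm:

```python
# Sketch: an additive display's gamut is the triangle spanned by its three
# primaries in CIE xy chromaticity space. Anything outside the triangle --
# like many saturated spectral colours -- simply can't be mixed from them.

R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)  # sRGB primaries (x, y)

def cross(o, a, p):
    """Signed area test: which side of segment o->a the point p lies on."""
    return (a[0] - o[0]) * (p[1] - o[1]) - (a[1] - o[1]) * (p[0] - o[0])

def inside_gamut(p):
    """True if chromaticity p lies inside the R-G-B triangle."""
    signs = [cross(R, G, p), cross(G, B, p), cross(B, R, p)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

print(inside_gamut((0.3127, 0.3290)))  # D65 white point -> True
print(inside_gamut((0.045, 0.295)))    # spectral cyan (~490 nm) -> False
```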

I always research monitors on Tom’s Hardware when I can, because they’re the only ones I know of that take a rigorous, scientific approach to measuring all aspects of LCD monitors. That way, when you’re dithering between five models with similar specs, you can quickly weed out the two or three that are junk.

Of course, with the huge proliferation of LCDs it’s impossible to test more than a fraction of them; they concentrate mainly on high-end, high-quality ones. (Or at least those that claim to be such.) I just have to hope that their close cousins are similar and just slightly worse; in my case VG930 instead of VP930.

I’m a Viewsonic bigot, myself. :wink:

HD15 (analog) definitely doesn’t have the color exactness of DVI, but you’d only notice if you push pixels for fun and profit. When I switched, I had to swap back and forth in photoshop a few times to find any difference.

How do you orient your 3 20" LCDs? All in landscape mode?

All landscape. Be very careful with portrait mode; rotating the display also rotates the RGB pixel matrix, which tends to break ClearType – or at least the 3x resolution improvement is now vertical instead of horizontal (see the sketch below the link). Not sure if Vista can deal with this or not, but I doubt it.

There’s more (somewhat dated, but accurate) info on the portrait vs. landscape ClearType issue here:

http://www.brandonfurtwangler.com/?p=54
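
Here’s a toy sketch of the subpixel issue, with made-up coverage values (real ClearType also does colour filtering that’s omitted here):

```python
# Sketch: on a landscape LCD, each pixel's R, G, B subpixels sit side by
# side horizontally, so a renderer can place glyph edges in 1/3-pixel
# horizontal steps by driving the subpixels individually.

# Glyph edge coverage sampled at 3x the horizontal pixel resolution
# (illustrative values only):
coverage = [0.0, 0.0, 0.2, 0.9, 1.0, 1.0, 1.0, 0.6, 0.1]

# Landscape: consecutive triplets map straight onto one pixel's R, G, B.
for n, i in enumerate(range(0, len(coverage), 3)):
    r, g, b = coverage[i:i + 3]
    print(f"pixel {n}: R={r:.1f} G={g:.1f} B={b:.1f}")

# Portrait: the physical R/G/B stripe now runs vertically, so these three
# horizontal samples would all land at the *same* horizontal position.
# The extra horizontal resolution is lost unless the renderer knows the
# panel has been rotated.
```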

Ian Griffiths entered a comment in that thread, I wonder if he’d want to follow up on that here…