Where Are The High Resolution Displays?

If you allow for laptop screens, that 100 DPI isn’t really the top of the consumer monitor spectrum.

I’m the proud owner (sort of) of a Dell Latitude D820 with a 15.4 inch WUXGA display. That’s 1920x1200, and roughly 147 DPI for those playing along at home. I bought it almost exactly a year ago, and at the time, Dell was the only company I could find selling WUXGA laptops at 15.4 inches.
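For anyone who really is playing along at home, the math is simple: pixel density is just the diagonal length in pixels divided by the diagonal size in inches. A quick Python sketch:

```python
import math

def dpi(width_px, height_px, diagonal_inches):
    """Pixel density: diagonal length in pixels over diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(dpi(1920, 1200, 15.4)))  # -> 147, the D820's WUXGA panel
```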

Now with a year under my belt on this bad boy, I know why they’re so hard to find. Everything is absolutely tiny. If I slouch back in my chair, I can’t read anything. Smaller images on websites essentially become thumbnails. (Thank goodness for the Firefox Mouse Gestures plugin that lets me scale up images with a quick mouse stroke.) And the natural dimness of laptop displays really doesn’t help things.

Yes, I could turn up the DPI settings in Windows, but let’s face it: most apps don’t handle non-default DPI settings very well, aesthetically. Even some of Microsoft’s own.

Anyway, if I had the choice to make again, I’d go with a 14.1" SXGA+ (1400x1050, 124 DPI) screen, for two big reasons. For one, everything would be a little easier to see. For two, the widescreen format just isn’t good for editing code or documents. Better to just go with a standard 4:3 screen and get multiple monitors. If you’re someone who needs widescreen, though, you might be better off with the more modest WSXGA+ (1680x1050, 128 DPI), or bumping up to a “desktop replacement” 17-incher for WUXGA (133 DPI).

(BTW, for anyone who wants to dismiss my complaints and is interested in getting one of their own, Dell discontinued this screen format for a while, but you can get it again now on the Latitude D830.)

Interesting, because I’ve seen people use 37" HDTVs as computer monitors.

Drool… I would love to have my computer hooked up to this bad boy: http://www.samsung.com/Products/TV/DLPTV/HLT6189SXXAA.asp. All my multimedia and computing needs served to me right in the comfortable butt-groove of my couch. =)

If you allow for laptop screens, that 100 DPI isn’t really the top of the consumer monitor spectrum.

Doh! I guess on my first read-through I missed the blockquote in your post that said essentially exactly this.

Hopefully displays never become any higher resolution. 90% of users simply browse the web, watch (poor quality) web videos, and leave their monitors on wasting energy. For high resolution monitors to be inexpensive, they would have to be affordable to the masses so that mass production would be profitable. Seeing as how Microsoft loves chewing up resources, I can see them running out of pixels in the not-too-distant future. Do users really need seamless text with detail only visible under a magnifying glass? I am more excited to see high contrast (100000:1) SED displays, and I anxiously wait to be dazzled by them.

http://en.wikipedia.org/wiki/Surface-conduction_electron-emitter_display

Floppy disks haven’t improved either. I don’t know the exact year, but once we reached 1.44/2.88 MB, development completely stopped (in favor of CDs, DVDs, Zip disks, etc.). Still, most PCs I see have a floppy drive (existing PCs, that is, not new ones), and they use pretty much the same hardware we had decades ago.

You said it yourself - laptops.

More and more people are moving to laptops. Especially laypeople who don’t really need the grunt of a huge desktop computer any more. As laptops take over you’re going to see average DPI go up on monitors by simple virtue of the fact that most computers will be laptops.

Lots of things to comment on. I just got a 24", 1920x1200. I don’t really want anything physically larger than this, so the only way anyone’s going to sell me a new monitor is if this breaks or they offer higher DPI. I want higher DPI - after all, that’s why it’s more pleasant to read text on paper. I know some people will always want larger displays, but I think when 24"ers come down in price to say $300-400, higher DPI might be the next step.

Games: double the DPI. This shouldn’t be a big deal - set the game at half the native res and the monitor should have an easy job of scaling. It should have the same quality as a display with half the DPI. Or the video card could scale - this shouldn’t be much work compared to the other rendering the card has to do.
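To illustrate that “easy job of scaling”, here’s a toy sketch, assuming plain nearest-neighbour doubling: at exactly half the native resolution, every rendered pixel maps to a clean 2x2 block, so no filtering (and no blur) is involved:

```python
def upscale_2x(frame):
    """Nearest-neighbour 2x upscale: each source pixel becomes a 2x2 block."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in range(2)]  # repeat each pixel
        out.append(doubled)
        out.append(list(doubled))                       # repeat each row
    return out

print(upscale_2x([[1, 2],
                  [3, 4]]))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```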

Programming on wide-screen: turn it sideways - now you have 1200x1920. I actually find x1200 to be enough - I like the extra space to the sides for icons or docks.

Effective dpi. Perhaps it makes more sense to measure the eye-to-display distance, and compare effective resolution in pixels per degree.
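A minimal sketch of the idea, using the standard visual-angle formula (the viewing distances below are just illustrative guesses):

```python
import math

def pixels_per_degree(dpi, viewing_distance_inches):
    """Pixels packed into one degree of visual angle at a given distance."""
    inches_per_degree = 2 * viewing_distance_inches * math.tan(math.radians(0.5))
    return dpi * inches_per_degree

# A 100 DPI desktop at arm's length vs. a 147 DPI laptop held closer:
print(round(pixels_per_degree(100, 24)))  # ~42 pixels/degree
print(round(pixels_per_degree(147, 16)))  # ~41 pixels/degree -- nearly the same
```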

Windows has “large fonts” and screen resolution settings. OS X Leopard will have a resolution-independent interface. Soon it will be convenient to sit as close or as far from your display as you are comfortable.

And if you have been looking at computer displays since 1984, I bet you are sitting farther back, so your effective resolution is increasing at better than industry standard. Cheers.

I will mention yet again, laptops. I have a 1920x1200 display on my notebook, and absolutely love it. This comes out to something like 147 dpi.

Additionally, I have an external older CRT monitor at 1600x1200. I love all of the real estate, but could definitely use more.

I will be attending RPI in the fall, and my only concern about the laptop that they will have for the students to buy is that it will not be high resolution, i.e. 1920x1200 (or better). (They haven’t yet released the specs for it.)

Just count how much memory you’ll need and how many CPU cycles will be wasted.
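A back-of-the-envelope sketch, assuming a single 32-bit framebuffer (double buffering, compositing, and per-window surfaces all multiply this further):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Raw memory for one 32-bit framebuffer, in binary megabytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

for w, h in [(1280, 1024), (1920, 1200), (3840, 2400)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.0f} MB per buffer")
# 1280x1024: 5 MB, 1920x1200: 9 MB, 3840x2400: 35 MB
```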

Although the solution to this is clear: send images compressed and send letters as vector data to the screen. Some tricks will be needed to make vector rasterization computationally cheaper, but for characters I think this could work well.

By the way, as far as I remember, there was a solution in the CRT world that used horizontal line phosphors instead of dots, so you get effectively infinite horizontal resolution.

Amen, someone who does care about resolution!

Last year, when buying an LCD monitor, I had basically two options (I didn’t want to go for the expensive widescreen stuff): a 17" running at 1280x1024 or a 19" running at 1280x1024. I said “pay more for the same screen space, actually resulting in a lower pixel density?” and bought the 17" one. I actually prefer smaller pixels, setting my OS to a large DPI setting (making text bigger and subpixel anti-aliasing really smooth), but sadly very few people realize this.

Now, about Dave Shea’s comment and ClearType vs. Apple’s Display PDF (or whatever their text rendering engine is called or is part of): he’s right that at higher resolutions you don’t need to align to the pixel grid as much. BUT fonts only define hinting for small sizes! As your resolution gets better, your fonts will naturally get bigger (in pixels), and ClearType’s hinting and grid-aligning will be greatly reduced, making it closer to Apple’s rendering (while still being arguably a tiny bit clearer to read).

I’m guessing that by the time we get to 200 dpi, the graphics hardware will be phenomenal, and we’ll still be doing anti-aliasing and ClearType because why not? The technology is there, we’ve got cycles to burn, and it makes it just that much better. Now if DPI suddenly jumped up without a corresponding boost in GPU power, I can understand if we dropped anti-aliasing and ClearType for a few years.

“1920x1200 doesn’t really help me see more code - really not any better than the 1600x1200 of most 4:3 19- or 20-inch displays. Developers need more vertical space for editing code.”

Am I the only one who keeps several emacs buffers open side-by-side while coding? I can easily fit two files on screen at the same time, by splitting the frame into two tall side-by-side views (or as emacs calls them, “windows”), even on my small laptop display. “C-x 3” is your friend. :slight_smile:

I just wanted to tell everyone that the T221 monitor is really great; the 200dpi in a 22 inch monitor is really wonderful. You can barely see the pixels and the rendering is really very smooth.

I don’t mind that the UI does not scale, as I have good eyes. Having a 3840x2400 pixel desktop is really a marvelous experience.

For the curious, there is an active group on yahoo (http://tech.groups.yahoo.com/group/IBM_T2X_LCD/messages?o=1) dedicated to the T221 family of displays.

Sometimes, you’ll find one of the T221 variants or maybe a Viewsonic VP2290b appearing on eBay. Sadly, these displays are no longer manufactured. IBM stopped selling them through their official channels in early 2006, which is a shame for such a wonderful piece of technology. But I understand that having to pay about $8000 for a monitor is a lot of money and that there never was much demand for them. Vista should change this, but people are so used to large pixels that I doubt there will be enough consumers ready to invest in high DPI displays.

But imagine having 9 million pixels with zero defects! It’s like having a Toshiba Libretto with a 22 inch display :slight_smile:

Pierre (author of www.creativedocs.net)

I remember distinctly (though without evidence) that the original Macintosh screen was 72 DPI.

Doubling the DPI will quadruple the number of pixels, which I think is the factor to look at if you are trying to make a “Moore’s law” comparison. But then “Moore’s law” was about doubling the number of transistors at the same price (if I remember correctly). Monitors aren’t really just about more pixels; they’re about the “quality” of the pixels too: refresh rates, contrast, brightness, colour space, etc.
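The quadratic part is easy to see with a hypothetical fixed-size panel, since doubling linear DPI doubles both dimensions:

```python
def megapixels(width, height):
    return width * height / 1e6

print(megapixels(1920, 1200))  # 2.304 MP at, say, 1x DPI on some panel
print(megapixels(3840, 2400))  # 9.216 MP at 2x DPI -- four times the pixels
```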

The current OS X, 10.4 Tiger, has UI scaling built in, but no general user interface for it. It can be set system-wide or for individual applications through the command line or with the Developer Tools.

Ars Technica: Mac OS X Tiger; Scalable User Interface
http://arstechnica.com/reviews/os/macosx-10-4.ars/20#scalable-ui

MacWorld: Play with GUI scaling
http://www.macworld.com/weblogs/macosxhints/2006/08/guiscale/index.php

@Chris Blow
It’s particularly ironic that the DPI Wikipedia article that Jeff linked to clearly has a section entitled “Misuses of DPI measurement” and talks about how people mistakenly apply the term to refer to monitors, instead of using the proper term (which Chris and others have pointed out to be PPI).

“The keyboard? :)”

I’d say this one has actually gone backwards.

Agreed. You also used to be able to find mechanical keyboards that did the job right.

Now turn that awful music down…

Another issue is that a big display + high DPI = a hell of a lot of pixels.
Operations your OS does quickly today will suddenly be noticeably slow.
Moving a window around the screen, redrawing it, fading the screen, etc. will need 10 times more processing (and more video memory too).
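Rough numbers for the fill cost, assuming a full-screen repaint at 60 Hz with 32-bit pixels (real window systems redraw less than the whole screen, but the scaling is the point):

```python
def repaint_mb_per_second(width, height, fps=60, bytes_per_pixel=4):
    """Memory traffic to redraw every pixel on every frame, in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

print(f"{repaint_mb_per_second(1600, 1200):.0f} MB/s")  # ~461 MB/s
print(f"{repaint_mb_per_second(3840, 2400):.0f} MB/s")  # ~2212 MB/s
```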

BTW, I watched a video on Channel9 about ClearType. Microsoft has invested millions in the technology and hired an independent benchmarking organization, which found serious productivity gains when reading ClearTyped text.

I am amazed that some people go to any length and reason like crazy to defend OS X’s superiority to Windows in all aspects. Don’t forget that Apple has a smaller OS budget, so it is understandable that they need to ‘cut corners’ here and there.

100 DPI ought to be enough for anybody.