Where Are The High Resolution Displays?

In a recent post, Dave Shea documented his love/hate relationship with the pixel grid:


This is a companion discussion topic for the original blog entry at: http://www.codinghorror.com/blog/2007/06/where-are-the-high-resolution-displays.html

What is the big deal about DPI?

Don’t get me wrong, I can definitely see the benefits when it comes to manipulating images or watching something like an HD DVD, but what about the majority of users who simply sit down at a computer to browse the web, type a paper, view an Excel spreadsheet, and so on? You know, that “average user”. What will they do if this becomes standard?

What about the normal “power users”, a group in which I’d include programmers? Is our code supposed to look better at this resolution?

In the end, I totally agree with Jeff here. I do not see ultra-high-DPI monitors becoming at all common any time soon.

“I can’t think of any other piece of computer hardware that has improved so little since 1984.”

The keyboard? :slight_smile:

All of the things you mentioned will look better and be easier on the eyes. ClearType and Apple’s equivalent are far from perfect; text would be much more pleasant to read if the DPI were increased. Anything you read should be easier on the eyes; anything you just look at should look nicer, too.

This is a real shame, but the advantage of raising the pixel density is clear: we could remove all the workarounds for low-resolution displays. MS ClearType vs. Apple’s rendering is the current example, but anti-aliasing in general is a workaround for the fact that the smallest display element, the pixel, can easily be resolved by the human eye.

Sitting in front of my 1280x800 laptop screen (13") and my 1280x1024 LCD (19"), the laptop gives a much clearer picture. Scale that up to 19", and I’d either have more pixels to work with, or screen elements of the same size that just looked much sharper… One of my big hopes would be that people would increase their font sizes and save their eyes :slight_smile:

The users who’ll benefit most from super-high-DPI monitors are graphic designers and photographers. With digital camera megapixel counts going higher and higher, viewing a picture at 100% without scrolling is impossible, even on the Apple 30" screen. If you’ve seen digital photos on one of those high-DPI screens, you’ll notice the difference and dream of having one at home :slight_smile:

How many dpi can we actually see? Check this page out:
http://www.blaha.net/Main%20Visual%20Acuity.htm
Basically it depends on viewing distance: the closer you are to the screen, the higher the dpi you need before the pixels disappear. The bigger the monitor, the further away you sit, so a crisp, smooth image actually needs fewer dots per inch.
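That rule of thumb is easy to put numbers on. A minimal sketch, assuming the usual figure that 20/20 vision resolves about one arcminute (the results are estimates, not panel specs):

```python
import math

def max_resolvable_dpi(viewing_distance_in, acuity_arcmin=1.0):
    """DPI beyond which a viewer with the given acuity (1.0 arcminute
    is the usual figure for 20/20 vision) can no longer pick out
    individual pixels at this viewing distance (in inches)."""
    pixel_angle = math.radians(acuity_arcmin / 60.0)  # arcminutes -> radians
    pixel_size_in = viewing_distance_in * math.tan(pixel_angle)
    return 1.0 / pixel_size_in

for d in (12, 24, 36):
    print(f'{d}" away: ~{max_resolvable_dpi(d):.0f} dpi')
```

At a typical 24" desktop viewing distance this works out to roughly 143 dpi, which is why pixels on today’s ~100 dpi panels are still plainly visible.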

I’d hope that they can stuff 2560x1600 on a 22" LCD, that would be good enough for me :slight_smile:

Interesting point. In some regards we have even gone backwards. Five years ago I purchased a mid-range laptop with a 15.4"/1920x1200 screen (on sale) for exactly $1,100. I’ve wanted to replace it for a while now, but every other screen I try out seems hopelessly clunky in comparison. Right now there is only one machine on Newegg with this combination, and it is $2,000. (How did it go up in price?) To get 1920 on a Mac you have to spend $3,000 for the 17", which I’d never spend on one machine. You can argue that 17" is a better size for it, but I like the lesser weight and size of 15.4", and I’m sorry to see that it didn’t become the standard.

“The keyboard? :)”

I’d say this one has actually gone backwards. Used to be, you could find completely standard keyboards all day for $10. Nowadays, you have to scour far and wide to find a keyboard that doesn’t have some bizarre cursor keypad rearrangement abomination, or uses the small backspace key, or sticks the “sleep, power, hibernate” buttons right next to the cursor keys, or throws a ton of useless “function buttons” at you, or…

Now get off my lawn.

Where Are The High Resolution Displays?

At Apple:

  • New MacBook Pro, 133 DPI
  • iPhone, 160 DPI

Jeff Atwood wrote:
“the resolution of computer displays has increased by less than a factor of two over the last thirty years”

1984 was way less than thirty years ago (I don’t have the equipment to do the exact calculation right now).

The Apple II in 1978 had a resolution of 280x192 on a regular television. On a typical 15" television of the day, that would be a little over 20 dpi. That was less than thirty years ago also.
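For what it’s worth, the dpi figure is just geometry: pixels along the diagonal divided by the diagonal size. A quick sketch (screen sizes are nominal diagonals, so treat the results as ballpark):

```python
import math

def dpi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Apple II hi-res on a nominal 15" television of the day:
print(f"Apple II: ~{dpi(280, 192, 15):.0f} dpi")
# Original 1984 Macintosh, 512x342 on its 9" screen:
print(f"Macintosh: ~{dpi(512, 342, 9):.0f} dpi")
```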

I’m all over higher DPI. However, what I am NOT all over is the hurt it is going to put on my system for gaming.

Already takes a 400+ dollar GPU to run on my display with all the eye candy, and even then I see fps dropping as low as 20 even on games that are a couple years old.

A resolution of 3840x2400 makes any current gaming PC cry… even the very top end.

Sure, AA won’t matter as much when you have that much pixel density, but even without the burden of anti-aliasing, I’m sure we’d need much more advanced hardware to get playable framerates on games.

One thing that no one has mentioned yet: Prior to Vista, Windows wasn’t exactly friendly to high DPI monitors. Let’s face it, there is only so small a character can physically be before you can’t see it. Since Windows was essentially pixel-based in all things (dialog box positioning, font sizing, etc), working with a display of significantly higher DPI than the average means that you are going to have to deal, on a daily basis, with all the software out there that doesn’t do sizing correctly.

Or go turn on ‘Large Fonts’ or ‘Extra Large Fonts’ in the Appearance tab on the desktop. Things work, but there’s plenty of software where stuff will be out of position, dialogs will look funny, buttons and fields will have chopped off text, etc.

Heck, just go around to some bad websites where the nitwit specified his font or table or frame in pixels. It’s bad enough on a normal monitor - what will it look like on a 200 dpi one?

Until Vista takes hold and most software starts to use the APIs correctly, we’re not going to see really high-res monitors, because they’ll be too painful to use, day to day.
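The arithmetic behind the breakage is simple: Windows assumes 96 dpi, and DPI-unaware software bakes that assumption into every pixel coordinate. A toy sketch of the scaling a DPI-aware app (or Vista’s DPI virtualization) has to apply — the function here is mine for illustration, not a Windows API:

```python
def scale_for_dpi(size_px_at_96, target_dpi):
    """Scale a dimension laid out at Windows' default 96 dpi
    up to a display running at target_dpi."""
    return round(size_px_at_96 * target_dpi / 96)

# An 11px font that's comfortable at 96 dpi...
print(scale_for_dpi(11, 96))
# ...has to be drawn this many pixels tall on a 200 dpi panel,
# or it shrinks to well under half its intended physical size:
print(scale_for_dpi(11, 200))
```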

I think one good point is that in conventional operating systems, everything gets smaller at higher resolutions. Many non-power users do not want everything shrunk to the point where they can barely read it. OS X 10.5 should fix this with its resolution independence. I do not know much about Vista, but I did notice an “increase font size” option that changed the DPI setting so that things became larger to accommodate higher resolutions.

Hmm, that’s what I get for not reloading my page before posting. Sorry for the similar comment.

To argue the other side: desktop display sizes are approaching the limit of what people want on their desk. (Ok, I’ll admit, I’d still like a giant monitor. But for your average man-on-the-street, the Mac 30" display is TOO big.) It’s not as hard a limit as in laptops, but there’s definitely some back pressure there.

Something I found out when researching High-DPI screens:

For a given technology, higher-DPI screens are typically dimmer than lower-DPI screens. This is because LCD-based displays have a matrix of thin black lines that define the edges of the pixels. (Use a magnifying glass to look at your LCD to see the matrix.) When you increase the DPI, the matrix lines don’t get any thinner, so the total area not covered by the matrix lines decreases, causing less light to pass through the LCD.

You can compensate for this by increasing the intensity of the back-light, but that wastes more power and generates more heat.
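The geometry is easy to model: hold the black matrix line width constant while the pixel pitch shrinks, and the open fraction falls. A sketch — the 0.0004" line width is an invented illustrative number, not a real panel spec:

```python
def aperture_ratio(dpi, line_width_in=0.0004):
    """Fraction of panel area that transmits light, assuming a
    fixed-width black matrix line along each pixel boundary
    (line_width_in is illustrative, not a real panel spec)."""
    pitch = 1.0 / dpi                     # pixel pitch in inches
    open_side = pitch - line_width_in     # transmissive span per pixel
    if open_side <= 0:
        return 0.0
    return (open_side / pitch) ** 2

for d in (100, 150, 200):
    print(f"{d} dpi: {aperture_ratio(d):.0%} open area")
```

Under these made-up numbers the open area drops from about 92% at 100 dpi to about 85% at 200 dpi, so the backlight has to work harder for the same perceived brightness.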

1984 was way less than thirty years ago (I don’t have the equipment to do the exact calculation right now).

Heh. :slight_smile: Corrected.

for your average man-on-the-street, the Mac 30" display is TOO big

Interesting, because I’ve seen people use 37" HDTVs as computer monitors…

http://www.codinghorror.com/blog/archives/000756.html

Comparing an original Macintosh with today’s screens is a bit misleading, as it used a CRT while today’s computers almost always use an LCD. My office replaced all its CRTs with LCDs a year or so ago, and it was definitely a step backwards. The LCDs of equivalent price and size to our old 1600x1200 CRTs only do 1280x1024. So yes, there’s been a bit of a move backwards as the LCD replaced the CRT. If LCDs had never been invented, then we’d probably all be looking at screens with a higher DPI.

Plus, Dell has sold 130+ DPI laptops for a while now.

I’m surprised you didn’t mention gaming, the one force that actually keeps the DPI from going up.

DPI is measured in only one direction, so when the DPI doubles, the number of pixels that have to be addressed quadruples.

Simply rendering at a lower resolution and then stretching the image clearly defeats the point. So faster graphics hardware is the only alternative.

So even if you get your hands on a nice 30" 200 DPI display (with a resolution of, say, 5120x3200), then we’re talking about more than 12 times as many pixels as a normal 1280x1024 (common for 19" TFTs).

When our graphics cards have no trouble dealing with such resolutions, then the DPI may increase. But currently, large screens with low DPI are the only possibility with the slow graphics hardware we have.
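The pixel-count math, spelled out:

```python
hi = 5120 * 3200   # the hypothetical 30" 200 DPI display above
lo = 1280 * 1024   # a common 19" TFT
print(f"{hi:,} vs {lo:,} pixels: {hi / lo:.1f}x as many")
```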

I just want it for my tired old eyes. I program for hours and hours each day. The company just sent out an ergonomic questionnaire. One of the questions was “Do you spend four hours or less per day working on the computer?” Correct answer: “Ha!”
I’ve tweaked as best I can and my 22 inch is pretty sharp, but with that amount of time on the screen, I would LOVE to have it easier on the eyeballs.
And as a non-game programmer (and pretty much a non-gamer), the horsepower needed to drive it isn’t the problem. It’s just the lack of availability… oh, and the cost if it were available.