The best screen resolution I’ve ever had was my Palm’s 320x240 pixel screen.
So, there’s broad agreement that setting large fonts and large icons just doesn’t work with much software. And broad agreement about small print and layout issues with websites.
What about those of us with LOW resolution vision? I don’t know if high res monitors will help fix these things, or make it much worse.
Do head-mounted displays offer the possibility of high res over an apparently large area? I don’t know where to go to find out about those displays: price, availability, performance…
These discussions are a bit frustrating. Even though Joel Spolsky pointed out that different is likely to look wrong when it’s new, many of the participants aren’t getting that this might apply to them.
I see a lot of long-time Windows users glancing at Safari for the first time and declaring that Mac vs. Windows type rendering comes down to accurate representation of typefaces vs. better readability. A discussion about different design philosophies ensues.
This is a strawman position, because I haven’t seen a shred of evidence that ClearType is more readable than Quartz text. We can’t even realistically compare the two systems, based on a single pair of screenshots of a Google page set in Microsoft’s bastard Arial font. Hinting is a feature of some fonts and not others, and varies in quality, and OS X does make some use of it.
To compare the two font engines, we need to see a variety of screenshot comparisons, with a variety of fonts.
And as David Shea pointed out,
“Marginalizing type designers is a pretty poor way to make any sort of point about typography, given that entire careers are based on an understanding of legibility and facilitating ease of reading. A statement like that one almost veers into dangerous ‘programmers knowing better than experts in their respective fields’ territory, which I can’t imagine was his goal.”
Readability is also affected a great deal by the selection and quality of fonts, as well as the line length and leading the text is set on—this is true for both print and screen. An evaluation of readability should consider at least some samples set by professional designers experienced in working with type.
Who says that more exaggerated hinting improves readability more than proper use of typefaces does? Who has demonstrated that ClearType’s readability is not worse than Quartz’s?
My 12" Dell with 1280x1024 yields a DPI of 136.6. Honestly, with subpixel rendering I don’t think a higher DPI would be very beneficial to me.
The Neo1973 phone from FIC will have a 2.8" diagonal (43mm x 58mm) 480x640 LCD screen. This corresponds to about 283 DPI, and the developers who already have it say it’s gorgeous. (More at http://wiki.openmoko.org)
I think I already have a 200 DPI display on my desk. My Dell Axim x51v has a 3.7 inch VGA display. Using the same calculator, it yields 216.2 DPI. And it looks great, it’s almost impossible to see the pixels. The fonts, etc., are adjusted so that everything is readable. This PDA was a little more expensive but still affordable to a fair number of consumers (IMHO).
I’m still waiting for my desktop version.
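The diagonal-DPI calculation these comments rely on is easy to reproduce: divide the pixel count along the diagonal by the diagonal length in inches. A minimal sketch (the function name is mine; the device figures are the ones quoted above):

```python
import math

def dpi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels along the diagonal divided by the diagonal length in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Figures quoted in the comments above:
print(round(dpi(1280, 1024, 12.0), 1))  # 12" 1280x1024 laptop panel -> 136.6
print(round(dpi(640, 480, 3.7), 1))     # Dell Axim x51v VGA -> 216.2
print(round(dpi(1200, 900, 7.5), 1))    # OLPC display -> 200.0
```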
When comparing DPI you have to take into account more than just the stated resolution and the screen size. A display rated at 1920 pixels wide may not actually have that many physical pixels available. Until recently most “high res” plasmas could display 1920 but had a native resolution of only 1366. Check the specs first – pixel pitch tells you as much as a ‘resolution’ number.
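Pixel pitch and DPI are reciprocals of one another via the 25.4 mm in an inch, so converting between the two is a one-liner. A quick sketch (function names are just for illustration):

```python
MM_PER_INCH = 25.4

def pitch_to_dpi(pitch_mm: float) -> float:
    """Dot pitch in millimeters -> dots per inch."""
    return MM_PER_INCH / pitch_mm

def dpi_to_pitch(dpi: float) -> float:
    """Dots per inch -> dot pitch in millimeters."""
    return MM_PER_INCH / dpi

print(round(pitch_to_dpi(0.25), 1))  # a 0.25 mm pitch panel -> 101.6 DPI
print(round(dpi_to_pitch(200), 3))   # a 200 DPI panel -> 0.127 mm pitch
```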
The benefit of a high DPI monitor is that you can fit more stuff on the same space, the same way you can put much, much more readable text on a piece of paper compared to a monitor of the same size.
It is refreshing to see concern about the quality of the graphics.
The story is incomplete without looking at COLOR.
Yes, the early machines might have been able to achieve some reasonably high resolutions, but only by abandoning colour. If you wanted 256 colours you had to drop the resolution.
You are not comparing like with like.
@ Mike Johnson
The white space is from a website designed for the majority of the www users that don’t have large (15") high-res (800x600) displays.
While bigger and better displays may be “standard” in the US, they are not necessarily worldwide.
I appreciate Jeff’s KISS blog design, as well as the content.
The post needs a mention of the iPhone. We’re up to 160 DPI.
I don’t understand why, above a certain screen size (the Apple Cinema HD), monitors drop in resolution to 1080p or 1080i. The price of each additional gain in DPI rises steeply.
I’m a firm believer in higher resolution displays; they offer more room to do things. My friend and I were always dreaming of tearing apart 4 19" 1280x1024 LCD monitors and making the gap between them as small as possible to emulate a super-high-resolution display (2560x2048). I believe this is possible with a CrossFire or SLI setup; correct me if I am wrong.
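For what it’s worth, a 2x2 grid of 1280x1024 panels tiles to a combined 2560x2048 (ignoring the bezels); the arithmetic generalizes to any grid. A tiny sketch:

```python
def grid_resolution(cols: int, rows: int,
                    panel_w: int, panel_h: int) -> tuple:
    """Combined resolution of a cols x rows tile of identical panels
    (bezels between panels ignored)."""
    return cols * panel_w, rows * panel_h

print(grid_resolution(2, 2, 1280, 1024))  # four 1280x1024 panels -> (2560, 2048)
```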
“At 100dpi, ClearType wins out.”
Whereas, at 80dpi, ClearType looks like I’m drunk.
Even when bigger displays are standard, most people don’t modify their resolution, and many people are still running CRTs, meaning that they’re running at the default resolution of Windows (800x600 if they’re running XP, 640x480 before that). For many people with vision problems it’s really not acceptable to run at higher resolutions without scaling the font size up, and many people have already noted the problems with that.
One of the things I love about running at high resolutions on wide screen monitors is opening documents side-by-side, or using the 2-page reading view in Word.
Someone previously mentioned HD movies, but realistically we’re already beyond the realm of HD when we’re on a computer. A higher DPI would mean that the HD movies would be smaller on the screen if they’re run in their native resolution than they are now, and most current computer monitors can run at a higher resolution than HD anyway (1080p? Let’s see 1600p).
One of the issues with current gaming performance has to do with antialiasing anyway. As cards became capable of acceptable performance at the high end of monitor resolutions, and manufacturers realized that most of their users were running games at low resolutions (1024x768 or 800x600 was common at the time, and some people used to run even lower resolutions to squeeze out maximum performance), they started using FSAA methods that often render at sub-pixel levels to improve the quality of the images on the screen. In other words, graphics card manufacturers have been working in a 200dpi (or 400, sometimes more) world and sending 100dpi to the screen for some time now, and you could easily disable the FSAA to get better performance on a higher resolution screen. The need for higher end video cards is primarily driven by the desire to use all possible features of any given game. None of my systems have top-of-the-line video cards, and I haven’t had much trouble running games in quite a while unless I had GPU-intensive features enabled (that usually can be disabled).
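The supersampling idea in the comment above can be sketched in a few lines: render at twice the target resolution, then box-filter each 2x2 block down to one output pixel. This is a naive SSAA downsample; the function name and the grayscale list-of-rows representation are just for illustration:

```python
def downsample_2x(image):
    """Average each 2x2 block of a grayscale image (a list of rows of
    pixel values) into one pixel -- the box-filter step of naive 2x
    supersampled antialiasing."""
    out = []
    for y in range(0, len(image) - 1, 2):
        row = []
        for x in range(0, len(image[y]) - 1, 2):
            block = (image[y][x] + image[y][x + 1]
                     + image[y + 1][x] + image[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A hard black-to-white edge rendered at 2x resolution...
hi_res = [[0, 255, 255, 255]] * 4
# ...comes out with a mid-grey transition pixel at 1x:
print(downsample_2x(hi_res))  # [[127.5, 255.0], [127.5, 255.0]]
```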
One Laptop Per Child (OLPC) has a 200DPI display. And the machine clocks in at $150 at the moment, so the technology can’t be that extortionate.
They say the screen specs are:
Viewing area: 152.4 mm × 114.3 mm (7.5" diagonal)
Resolution: 1200 (H) × 900 (V) (200 DPI)
Technically, a 15" monitor is a quarter of a 30" monitor, not a half.
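To spell out the arithmetic: at a fixed aspect ratio, viewable area scales with the square of the diagonal, so half the diagonal means a quarter of the area. A quick sketch (the 4:3 aspect ratio is just an illustrative assumption; the 1/4 ratio holds for any shared aspect):

```python
import math

def screen_area(diagonal_in: float, aspect: float = 4 / 3) -> float:
    """Viewable area in square inches for a given diagonal and aspect ratio."""
    height = diagonal_in / math.sqrt(aspect ** 2 + 1)
    return aspect * height * height  # width * height

print(screen_area(15) / screen_area(30))  # 0.25 -- a quarter of the area
```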
I have a Viewsonic VP2290b, the 200dpi monster that used to sell for $10k. They can be had on eBay for less than that now (still more than a new Dell 30").
It has been difficult to live with. I’m tied to a particular video card now that has dual dual-link DVI outputs, and I still only get 30Hz refresh (which is actually just fine with this screen). Fonts are a big problem - all the fonts that work well on 75dpi screens tend to suck at 200dpi. It draws 150 watts and has a fan inside it. Unresized 10 megapixel photos are eye-opening; the sharpness of consumer LCDs is gone, but it’s replaced by a depth of fine detail normally only realised when viewing a professional print on photo paper.
But it sure has its advantages when coding. I fit three 132x132 terminal windows side by side. A 1280x1024 browser window only takes up one corner of the display. I can work on 5 source code files, have 3 man pages open, and an SQL console, with each window being full-sized and capable of doing normal work.
Apparently they aren’t making these displays any more. Not only are they expensive to make, they’re not very popular with the majority of people, who have to squint to read the small fonts it is capable of displaying. 200dpi was too big a jump; there needs to be organic growth in resolution so that operating systems have a chance to catch up.
“100 DPI ought to be enough for anybody.
Haacked on June 15, 2007 02:33 AM”
Update your .sigs now, people…
Vizeroth wrote, “Someone previously mentioned HD movies, but realistically we’re already beyond the realm of HD when we’re on a computer. […] most current computer monitors can run at a higher resolution than HD anyway (1080p? Let’s see 1600p).”
I think you’re mistaken about how HDTV resolutions are denoted. The number is the vertical resolution, so 1080p is actually 1920 x 1080, a resolution that only a handful of “current computer monitors” support.
Aren’t we talking about PPI and not DPI?