a companion discussion area for blog.codinghorror.com

Welcome to the Post PC Era



The trouble with high-resolution PC displays is that Windows 7 and earlier don’t scale up well. You can increase the font size, but that doesn’t make everything scale up, so some things you need to click on are very small. Also, not everyone tests their applications with different font sizes, so you get strange effects with bigger fonts.

Well-behaved Windows applications should scale up all user interface elements, and in general the OS itself has been pretty good at doing this since Windows Vista. Regrettably, making programs that look good at multiple DPIs is harder than it should be: try getting pixel-perfect bitmap images in a WPF application at multiple DPIs, for example. WinRT at least provides built-in support for multiple-DPI bitmap images, but until that framework is made to work on the desktop, it’s useless for many classes of programs.

As for being in a post-PC era, can we keep the high-resolution screens and leave the walled gardens?


Things have been improving, bit by little bit. I bought a 27" monitor in 2007 with a 1920x1200 resolution, and in 2011 the best available at that size was upgraded to 2560x1440 - 30% more in each direction. I’m thinking about it. Compared to the doubled Apple resolution, it’s not much, but it’s not like manufacturers are completely forgetting that aspect.

As an aside, reading text for an hour on the iPad 2 hurt my eyes. Reading on an iPad 3 does not, as long as the room is adequately bright. It does make a big difference in how much you have to squint.


Heh, not quite. That was written in 1997, and “in ten years” would have been 2007. Only now in 2012 are we getting the very first truly hi-res mass-produced screen. But it’s attached to this Post PC tablet, you see…

I bought a Toshiba u820 in 2008, so he was pretty much spot on.

It sucked! Windows isn’t really built for high-DPI devices (try setting the DPI to anything but the default and be prepared for a world of hurt), and Toshiba wasn’t/isn’t really prepared to make a high-profile product with Ubuntu (or another distro).


@A Facebook User Thank you, thank you so much for pointing that out!!!
Better display is not computing. Didn’t we have fun back in the day with our Amigas and 486 CPUs? We sure did! Jeff, I’m sorry to say that I’m disappointed in your post. Perhaps you wrote it just to gain popularity, since the new iPad keywords are extremely popular these days? :S


Congratulations on having good near vision. For the farsighted folks out there, of which there are many, reading anything up close can be a considerable challenge which no level of pixel density will help.
I’m blessed/cursed with mild nearsightedness, and I do miss the pixel density of my iPhone 4; that’s about all I miss about it.
Saying those who complain that “all they did was upgrade the screen” are bordering on stupidity is harsh and subjective. What one person considers a must-have feature, the next can hardly be bothered by. And then there’s the fact that not everyone is a gadget junkie. And on the other end of the spectrum are the technophobes who are scared of specs.
For many, a device with a “retina display” is something they can’t do without because Apple has spent over a billion marketing dollars to convince them of such. When it comes down to it, many can’t tell the difference: http://www.pcmag.com/article2/0,2817,2401726,00.asp
The placebo effect is a real thing.


Smartphones are faster and have more memory than mainframes of the 1980s. Everyone now has the problem of figuring out what to do with them. The computer industry has the problem of figuring out how to make money with them or off the users.

How many useless variations in operating systems do we need? How often do we need to upgrade? We are supposed to buy a computer because it is thin? LOL I would rather have a thicker computer that would run for 24 hours.


(from twitter) @clipperhouse: I wish there was a product that combined an iPad, a keyboard, and some way to prop up a screen


Neither Windows nor Mac OS X scale that well - too many assumptions of too many programs would be broken. Even iOS with its retina displays actually does not really scale. The reason why Apple exactly doubles the resolution, while keeping the size the same, is that iOS now treats a square of four pixels as one pixel. Oh, the text is properly anti-aliased, and you can display icons, photos and videos accurately, but the default coordinate system is still the same as on the original iPhone or iPad. That makes it easier for applications to run unmodified. Of course, unmodified apps have low-res icons, and other graphics, so they only benefit when rendering text and vector graphics.

Fortunately, you can change the scaling in Quartz 2D and optimize your code for retina displays. But you do that, again, by knowing what the screen resolution is; the next step will have to double the resolution again, and programs will have to be adapted again.
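The point-vs-pixel mapping described above can be sketched in a few lines. This is our own illustration of the idea, not Apple’s API: iOS keeps the logical coordinate system fixed and applies an integer scale factor when mapping to device pixels.

```python
# Sketch of iOS retina scaling: apps draw in a fixed logical
# coordinate space ("points"); the system maps points to device
# pixels using a whole-number scale factor. Names here are ours,
# purely illustrative.

def points_to_pixels(width_pt, height_pt, scale=2):
    """Map a logical point size to device pixels at a given scale."""
    return (width_pt * scale, height_pt * scale)

# Original iPad and retina iPad share the same 1024x768 logical canvas:
print(points_to_pixels(1024, 768, scale=1))  # original iPad: (1024, 768)
print(points_to_pixels(1024, 768, scale=2))  # retina iPad:   (2048, 1536)
```

Because the scale factor is exactly 2, an unmodified app keeps working: its coordinates stay valid, and only its bitmap assets render at the lower effective resolution.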

Apart from KDE 4 and GNOME 3, no UI has made the effort to go fully scalable, i.e. using vector graphics for the icons (pixel graphics will always have scaling issues) and actually honoring the DPI information of the screen (Android has screen classes, so while Android apps don’t automatically scale to whatever screen there is, the situation is good enough). So at the moment, a high-res screen for a desktop PC would only be usable on a Linux system, which probably explains why nobody is making them: the market is too small.

My hope is that the new iPad puts enough pressure on manufacturers to make high-res screens with whatever stopgap technology is needed (the iOS approach of just doubling the resolution, but not telling legacy programs, should be good enough for Windows, too).


I think many, if not most, folks here are missing the fundamental point of the higher resolution display:

Couple this with a stylus, and you bet we’re in a whole new world of computing.

Current displays are not close enough to paper resolution to be good for anything other than sticky notes. But the iPad 3… and similarly-spec’d devices… and I think we’re on to something.


The title is misleading. While the rise of different (more mobile) devices is evident, the static multi-purpose machine is still in great need. In fact, cell phones are getting bigger and more feature-rich to mimic what we get in a PC. So the PC metaphor is more valid than ever; the problem lies in the actual implementation, the actual PC machines.

About screens: hardcore gamers try to partially solve the problem by using multiple monitors. Video cards like the AMD/ATI Radeon HD 6850 are designed to use all the space combined.


Monitors in laptops are in need of a refresh across the board - I fully agree, and one good takeaway from the launch of the iPad 3 is that Apple has put resolution centre-stage as a key metric by which consumers should measure devices. What features the masses are concerned with, manufacturers will build, and hopefully this push for higher resolution will trickle down across the rest of the digital device ecosphere over the next few years.

Luke W
Community Manager


Thanks for the link, and for using a quote from my blog in this well-written and accurate post. Only inaccuracy I could spot was my name. It’s Bill Hill, not Bill Hills :slight_smile:

I think you’re absolutely right, no matter what some of the commenters here believe. Higher resolution is a key computing advance.

Since personal computers first appeared, humans have had to adapt to their idiosyncrasies - and one of the least noticeable impacts of low-res was that our brains had to perform lots of extra work doing the pixel interpolation needed to turn blocky assemblies of coarse pixels into text and pictures we understand.

With this breakthrough into higher resolution, and a much easier and more intuitive UI, Apple has adapted the computer to humans, instead of the other way around.

Of course, higher resolution has nothing to do with some arbitrary number like 1024 x 768, or 2560 x 1920. It’s about the number of pixels you pack into an inch.

Human vision has a vernier acuity of 600 pixels per inch (ppi). That’s edge detection. However, in practice, there’s a strong Law of Diminishing Returns, which means the improvement a user sees starts to fall off dramatically by around 200 ppi. Throwing more ppi at the screen brings scarcely noticeable improvement. And the math is killer. To go from 100 ppi to 200 ppi means four times as many pixels to compute. You need a much faster, harder-working graphics card, and that uses a lot more power. Going to 300 ppi is 9x; to 600 ppi, 36x!
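The “killer math” above is quadratic: pixel count grows with the square of pixel density. A quick sketch (our own illustration of the arithmetic in the comment):

```python
# Pixel density scales per linear inch, but rendering workload
# scales per square inch, so doubling ppi quadruples the pixels.

def pixel_multiplier(target_ppi, base_ppi=100):
    """How many times more pixels per unit area vs. the baseline density."""
    return (target_ppi / base_ppi) ** 2

for ppi in (200, 300, 600):
    print(f"{ppi} ppi -> {pixel_multiplier(ppi):.0f}x the pixels of 100 ppi")
# 200 ppi -> 4x, 300 ppi -> 9x, 600 ppi -> 36x
```

This is why each step up in density demands so much more from the GPU and the battery, and why diminishing perceptual returns past ~200 ppi matter.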

This killer math is why high-res displays made it onto mobile phones long before a 10" iPad.

I’m amazed that Apple managed to double the number of pixels in the iPad display and still retain the same battery life (which I think is also key - a student or worker can use it for the entire day without requiring a power cord and an outlet).

Yes, more pixels per inch would be nice. But Apple has broken through a threshold with the new iPad, and I’m prepared to stay at this level forever if need be. And I’m certainly never going back!

I’ve written about all of these issues in other posts on my blog, The Future of Reading:


As you kindly say, I’ve been pioneering readability onscreen for a long time; I created my first eBook in 1985, when I wrote the user manual (remember those?) for Guide, the first Macintosh hypertext authoring program.

I’ve been a writer for some 56 years. And I’m writing this now on my iPad, which has just become the best and most flexible writing system I’ve ever used in my whole life - with the addition of an Apple wireless keyboard and an Origami iPad stand/keyboard case costing a total of $110.

I write about this in my latest post:


Once again, thanks for focusing on this topic.


I am not 100% in agreement with the reason for the iPhone/iPad sales growth. It is not just because people felt PCs are dead. The rapid adoption of the iPhone and iPad compared to the MacBooks is also due to the tight global social connections of people. Word travels quicker than in earlier times. Hence adoption rates are also faster.
Mac OS X Lion has one of the fastest adoption rates. That is because Apple used the power of the internet to get the product easily to the consumer. Previous OSes depended on physical CDs and DVDs to reach the consumer.


To echo and paraphrase the other comments: Saying “An iPad isn’t a valid substitute for a PC because you can’t write apps on it” is like saying “A Mini isn’t a valid way to commute because you can’t haul other cars with it.” Not everyone needs a computer to write other computer programs. The days when computers were primarily used by computer programmers are long gone.


Yes, I don’t know what game Windows or Microsoft is playing; that’s why there are so many people going to online repair sites like Techie Now, PC Ninja and whatnot to fix their computer problems. I also own an iPad 3, and the icons were stunning and alive. I was also amazed by Apple, since they don’t waste time and give consumers what they want - their ideas plus Apple’s elegant innovation make it a true titan in the world of technology, thanks also to the man with the vision for it all, the late Steve Jobs (RIP). His legacy will continue, and his contribution to the world of technology has surpassed any expectations. A true prodigy.


So now we get busy figuring out how to make tablets into our developer platform for the early 21st century. That’d be so nice.


I see many people today looking at this from a very ‘black and white’ perspective - thundering headlines about the PC/Desktop being dead. The traditional PC won’t be going away for a few years yet. Anyone who works any kind of data entry job such as programming, finance, statistics of any type… the people who do the boots-on-the-ground entry and work for it… they won’t be using a pad or mobile device of any type very soon to do their work. If they do, it will probably have a monitor, keyboard, and mouse attached to it, which would make it… yes, you guessed it, a desktop-like computing machine :slight_smile:

I am sitting in an office next to a rack room that has a couple hundred computers in it. Not tablets. Not mobile phones. Servers in rackmount cases, many of which are KVM’d to other parts of this building running specialized software. The software won’t be created anytime soon for the mobile platforms, and those platforms don’t have the reliability and power yet anyway. My accounting department down the hall won’t be giving up their workstations anytime soon, at most sometime in the future they might be migrated to a cloud situation.

There is a lot more going on under the hood of our pretty little internet, WWW, and ‘connected world’ than a lot of people realize. The desktop computer will mutate and change eventually, once something appropriate to replace it has come along. People say keyboards are dead. Again, I say ask anyone doing data entry. A good typist will blow the doors off anyone using voice recognition, and even with really good voice-recognition software, privacy and confidentiality are an issue.

To the average user out there at the shallow end of the pool who uses an application or two, gets their email, and starts their vehicle from their mobile device, things may seem like a grand revolution. For those of us who have an idea of what is behind the scenes, we know it takes a little longer for things to really change. The real power is in the back room. It just isn’t visible to everyone.


I cannot believe that it was already so long ago, in 1975, that Gates started his empire… and the power is so large. Google, likewise, is a very big company with much effect on global economics too.

Nice post


Just 8.5% of internet usage is mobile.

The web is above all a desktop platform, and will continue to be for the foreseeable future despite the mobile web becoming more important. So much for “post PC.”

The “post-PC era” is a myth. “Post PC” is an official Apple propaganda term from a corporation which, after all these years, has still barely scraped past 5% desktop market share.